Event-Driven Development: Unlocking Optimized Dapps and Subgraphs

Welcome to the inaugural edition of The Graph Builders Blog! The Graph Builders Blog is a space where builders can come together, learn from each other, and share their knowledge with The Graph community at large. To learn more and apply to contribute to The Graph Builders Blog, go to the end of this post.

This contribution comes from Denver Baumgartner of Rubicon. Check out the subgraph examples in this post on Rubicon’s GitHub repo.

In this post, I'll introduce the concept of event-driven development (EDD), which can enhance smart contract and subgraph development when using The Graph.

But before we dive into event-driven development, let's go over a few fundamentals.

Understanding Events and Subgraphs' Event Handling

In Solidity, an event is a mechanism to log and notify the Ethereum blockchain and its users when a specific action occurs within a smart contract. When a smart contract emits an event, it creates a message stored on the blockchain. External applications can then access and query these logs, making events a convenient way to communicate state changes and trigger actions accordingly. Ethers.js, one of the most widely used web3 libraries, lets you create event filters (on indexed parameters only) for exactly this purpose; see the ethers.js documentation for details. Subgraphs use eventHandlers to handle these events in subgraph mappings.

Events are defined using the event keyword followed by the event name and its arguments in parentheses. Here's an example of an event definition and the function that calls it:

pragma solidity ^0.8.9;

/// @notice Events contract for logging trade activity on Rubicon Market
/// @dev Provides the key event logs that are used in all core functionality of exchanging on the Rubicon Market
contract EventfulMarket {
    // Define the event
    event RecordOffer(
        bytes32 indexed id,
        bytes32 indexed pair,
        address indexed maker,
        ERC20 pay_gem,
        ERC20 buy_gem,
        uint128 pay_amt,
        uint128 buy_amt
    );

    ...

    /// @notice Key function to make a new offer. Takes funds from the caller into market escrow.
    function offer(
        uint256 pay_amt,
        ERC20 pay_gem,
        uint256 buy_amt,
        ERC20 buy_gem,
        address owner,
        address recipient
    ) public virtual can_offer synchronized returns (uint256 id) {
        require(uint128(pay_amt) == pay_amt);
        require(uint128(buy_amt) == buy_amt);
        require(pay_amt > 0);
        require(pay_gem != ERC20(address(0)));
        require(buy_amt > 0);
        require(buy_gem != ERC20(address(0)));
        require(pay_gem != buy_gem);

        OfferInfo memory info;
        info.pay_amt = pay_amt;
        info.pay_gem = pay_gem;
        info.buy_amt = buy_amt;
        info.buy_gem = buy_gem;
        info.recipient = recipient;
        info.owner = owner;
        info.timestamp = uint64(block.timestamp);
        id = _next_id();
        offers[id] = info;

        require(pay_gem.transferFrom(msg.sender, address(this), pay_amt));

        emit LogItemUpdate(id);
        emit RecordOffer(
            bytes32(id),
            keccak256(abi.encodePacked(pay_gem, buy_gem)),
            msg.sender,
            pay_gem,
            buy_gem,
            uint128(pay_amt),
            uint128(buy_amt)
        );
    }
}

In this example, we define a RecordOffer event with the relevant parameters for our subgraph:

  • id: the unique id of the offer
  • pair: the pair of the offer (e.g., ETH-USDC)
  • maker: the address that made the offer
  • pay_gem: the asset the offer is selling (or pay)
  • buy_gem: the asset the offer is buying
  • pay_amt: the amount to be paid (or sold)
  • buy_amt: the amount to be bought

The ratio of pay_amt / buy_amt determines the price of the offer.

The offer function allows the user to create an offer stating the asset they would like to sell, the asset they would like to receive in return, and the amounts of each, which together determine the “price” of that offer. When the offer function is successfully called, the RecordOffer event is broadcast with the relevant information from the offer itself. This lets us index the relevant information from that offer without calling chain state, improving the efficiency of indexing and reducing resource costs.
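As mentioned above, external applications can also consume these logs directly. Here is a minimal ethers.js (v5) sketch that filters RecordOffer events by the indexed maker parameter; the RPC URL, contract address, and wiring are placeholders:

import { ethers } from "ethers";

// Placeholder wiring -- substitute your own RPC endpoint and market address.
const MARKET_ABI = [
  "event RecordOffer(bytes32 indexed id, bytes32 indexed pair, address indexed maker, address pay_gem, address buy_gem, uint128 pay_amt, uint128 buy_amt)",
];
const provider = new ethers.providers.JsonRpcProvider("https://your-rpc-url");
const market = new ethers.Contract("0x0000000000000000000000000000000000000000", MARKET_ABI, provider);

async function watchOffers(maker: string): Promise<void> {
  // Build a filter on the indexed `maker` parameter of RecordOffer.
  const filter = market.filters.RecordOffer(null, null, maker);

  // Fetch historical logs that match the filter...
  const pastOffers = await market.queryFilter(filter);
  console.log(`Found ${pastOffers.length} past offers from ${maker}`);

  // ...and react to new ones as they are emitted.
  market.on(filter, (id, pair, makerAddr, pay_gem, buy_gem, pay_amt, buy_amt) => {
    console.log(`New offer ${id} from ${makerAddr}: selling ${pay_amt} for ${buy_amt}`);
  });
}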

The Graph has a wonderful set of tools to automate the generation of code needed to process events while indexing, namely codegen. There is a wealth of subgraph development resources available in The Graph’s documentation that I recommend checking out if you want to become more familiar with subgraph development from a practical standpoint.

For the purposes of this article, all you need to know is that codegen takes a contract’s application binary interface (ABI) and creates classes for events that you can access in your mappings, such as the one seen below:

export class RecordOffer extends ethereum.Event {
  get params(): RecordOffer__Params {
    return new RecordOffer__Params(this);
  }
}

export class RecordOffer__Params {
  _event: RecordOffer;

  constructor(event: RecordOffer) {
    this._event = event;
  }

  get id(): Bytes {
    return this._event.parameters[0].value.toBytes();
  }

  get pair(): Bytes {
    return this._event.parameters[1].value.toBytes();
  }

  get maker(): Address {
    return this._event.parameters[2].value.toAddress();
  }

  get pay_gem(): Address {
    return this._event.parameters[3].value.toAddress();
  }

  get buy_gem(): Address {
    return this._event.parameters[4].value.toAddress();
  }

  get pay_amt(): BigInt {
    return this._event.parameters[5].value.toBigInt();
  }

  get buy_amt(): BigInt {
    return this._event.parameters[6].value.toBigInt();
  }
}
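
To connect this back to mappings, here is a minimal, hypothetical handler that consumes the generated RecordOffer class and persists it as an Offer entity (the Offer entity and its field names are illustrative, not Rubicon’s actual schema):

import { RecordOffer } from "../generated/RubiconMarket/RubiconMarket";
import { Offer } from "../generated/schema";

export function handleOffer(event: RecordOffer): void {
  // Use the event's unique id as the entity id.
  let offer = new Offer(event.params.id.toHexString());

  // Copy the rich event data straight into the entity -- no eth_call required.
  offer.maker = event.params.maker;
  offer.pair = event.params.pair;
  offer.payGem = event.params.pay_gem;
  offer.buyGem = event.params.buy_gem;
  offer.payAmt = event.params.pay_amt;
  offer.buyAmt = event.params.buy_amt;
  offer.createdAt = event.block.timestamp;

  offer.save();
}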

To review, we’ve quickly covered:

  • what Solidity events are and how smart contracts emit them
  • how subgraphs process those events through eventHandlers in their mappings
  • how codegen turns a contract’s ABI into event classes you can use in those mappings

Now that you understand these fundamentals, let's introduce event-driven development!

What is Event-Driven Development (EDD)?

Event-Driven Development (EDD) is a development strategy that centers around the creation of data-rich smart contracts for decentralized applications (dapps). It lays a foundation of smart contract events with the end application or use case in mind, which in turn helps to effectively deliver both current and historical data to users.

One of the key principles of EDD is ensuring the smart contracts emit rich and relevant events. This approach is integral to building a robust foundation for your dapp, and helps in optimizing the subgraph's indexing and querying process. This optimization significantly enhances the frontend operations and overall user experience, contributing to the scalability and usability of your dapp.

The emphasis on event-rich smart contracts helps to create a data-rich environment conducive to building and scaling dapps and subgraphs. Let's go into how this simple concept can have a cascading effect on the further development of your dapp.

Advantages of EDD

  1. Improved Dapp Planning: By thinking critically about specific events, the data they contain, and the timing at which they are emitted before writing the front end of your dapp, you avoid time wasted on refactoring and can more confidently plan the user experience.
  2. Scalability and Responsiveness: EDD is advantageous for subgraph optimization, as subgraphs index events more rapidly with eventHandlers as opposed to callHandlers or blockHandlers. If your dapp needs to scale, writing your smart contracts so that each key moment emits a rich event with all relevant metadata attached will allow your subgraph to scale along with your data demands.
  3. Maintainability: The modular structure of EDD simplifies code maintenance and debugging. By organizing the codebase around events, developers can more easily pinpoint and resolve issues, reducing the time and effort needed for troubleshooting.
  4. Enhanced Collaboration: Consider multiple smart contracts with multiple subgraphs currently indexing the smart contracts' data, each subgraph having its own mappings and schemas. This situation can quickly become overwhelming unless a strategy is considered early in development. Integrating EDD during your development process makes events a key focus for discussion and optimization between your smart contract team and subgraph team.

Another major advantage of EDD is cost.

In practice, our team has been able to significantly reduce the cost of supporting a production-grade application by drastically cutting its RPC usage. For example, instead of 1,000 users each making an RPC call to fetch the current balance of a pool, those 1,000 requests become GraphQL queries served by a PostgreSQL database. Furthermore, that PostgreSQL database can be customized to your specific subgraph; sharding, optimizing for cache hits, and a variety of other Indexer tricks are a topic for a separate post.
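
To illustrate the difference, here is a rough sketch of the same balance read done against an RPC node versus against a subgraph (the endpoint URL, addresses, and entity names are hypothetical):

import { ethers } from "ethers";

const POOL_ADDRESS = "0x0000000000000000000000000000000000000000"; // placeholder
const USER_ADDRESS = "0x0000000000000000000000000000000000000000"; // placeholder
const SUBGRAPH_URL = "https://example.com/subgraphs/name/your-subgraph"; // placeholder

async function readBalances(): Promise<void> {
  // Option 1: every user triggers an eth_call against an RPC node.
  const provider = new ethers.providers.JsonRpcProvider("https://your-rpc-url");
  const erc20Abi = ["function balanceOf(address) view returns (uint256)"];
  const pool = new ethers.Contract(POOL_ADDRESS, erc20Abi, provider);
  const rpcBalance = await pool.balanceOf(USER_ADDRESS);

  // Option 2: the same data is served from the subgraph's Postgres-backed GraphQL API.
  const query = `{ userBalance(id: "${USER_ADDRESS.toLowerCase()}") { balance } }`;
  const res = await fetch(SUBGRAPH_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query }),
  });
  const { data } = await res.json();

  console.log(rpcBalance.toString(), data.userBalance.balance);
}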


In short, it pays to scale your protocol with subgraphs in mind.


Using EDD with Smart Contracts

Now that you understand EDD, let's go through how we use EDD at Rubicon Finance, starting with our smart contracts.

To incorporate an event-driven mindset into our smart contracts, we started from the perspective of our dapp’s interface and worked backward down the stack.

In the context of the event and functions outlined above, we know that users are going to want not only to place offers, but also to see past offers and the status of any outstanding offers. Therefore, in our contracts, whenever an update occurs to an offer (someone trades with the maker of the offer, the offer maker cancels the offer, or the offer matches with another offer), we make sure to emit all information relevant to that state change as an event.

By determining these events ahead of time, we are able to utilize EDD to plan out the entire data flow of our application during the creation of the smart contracts themselves.

From a schema perspective, we are able to create a lot of important data structures that can drive different components of the application. An excellent example of this is candlestick data:

Because our contracts are emitting trade data as trades occur, we are able to update in real time the candlestick chart view simply by mapping those events to entities in our subgraphs!
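
As a rough sketch of that mapping (assuming a RecordTake trade event and an hourly Candle entity; the parameter and field names here are illustrative, not Rubicon’s actual schema):

import { BigInt } from "@graphprotocol/graph-ts";
import { RecordTake } from "../generated/RubiconMarket/RubiconMarket";
import { Candle } from "../generated/schema";

const HOUR = BigInt.fromI32(3600);

export function handleTake(event: RecordTake): void {
  // Bucket the trade into the hour it occurred in.
  let bucket = event.block.timestamp.div(HOUR).times(HOUR);
  let id = event.params.pair.toHexString() + "-" + bucket.toString();

  // pay_amt / buy_amt stand in for the real RecordTake parameter names.
  let price = event.params.pay_amt.toBigDecimal().div(event.params.buy_amt.toBigDecimal());

  let candle = Candle.load(id);
  if (candle == null) {
    candle = new Candle(id);
    candle.pair = event.params.pair;
    candle.openTime = bucket;
    candle.open = price;
    candle.high = price;
    candle.low = price;
    candle.volume = BigInt.fromI32(0);
  }

  if (price.gt(candle.high)) candle.high = price;
  if (price.lt(candle.low)) candle.low = price;
  candle.close = price;
  candle.volume = candle.volume.plus(event.params.pay_amt);
  candle.save();
}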

From here, our dapp can utilize live-polling through the graph-client to listen to updates on specific subgraph entities, like the most recent candlestick, and update the UI accordingly! This is an incredibly efficient way to retrieve and display data from the chain. Not only are you reacting to events as they occur, but you are also removing the need to run, or pay for, a node to retrieve data.

If you haven’t tried out graph-client before, I highly recommend looking into it. There are a variety of useful configurations, like fall-back patterns, that allow you to manage a network of subgraphs and Indexers in order to provide users a seamless, high-uptime experience.
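
Under the hood, a live query boils down to periodically re-fetching an entity and pushing the result into the UI. Here is a minimal sketch of that idea using plain polling (graph-client’s live-query and fallback plugins handle this more elegantly; the subgraph URL and entity fields are placeholders):

const SUBGRAPH_URL = "https://example.com/subgraphs/name/your-subgraph"; // placeholder

const LATEST_CANDLE_QUERY = `{
  candles(first: 1, orderBy: openTime, orderDirection: desc) {
    id
    open
    high
    low
    close
  }
}`;

async function fetchLatestCandle(): Promise<any> {
  const res = await fetch(SUBGRAPH_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query: LATEST_CANDLE_QUERY }),
  });
  const { data } = await res.json();
  return data.candles[0];
}

// Re-poll every few seconds and hand the latest candle to your UI layer.
setInterval(async () => {
  const candle = await fetchLatestCandle();
  console.log("latest candle", candle); // replace with your chart update
}, 5000);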

Events also allow you to recreate the current state of smart contracts in your subgraph mappings, and they help with historical analysis when you want to check the state of a contract at a certain block in time. An example of this is tracking a user's balance in a Rubicon liquidity pool: if we know every deposit, withdrawal, and transfer that has occurred, we can determine a user's balance at any point in time. Time travel queries are helpful to this end, but they remove the ability to prune subgraphs, and pruning can improve performance at scale (especially if entities are updated frequently; if they are not, mark them immutable). In practice, we have found it helpful to break different application needs across multiple subgraphs.
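
For reference, a time travel query simply pins the query to a specific block and is sent like any other query (the entity and field names below are hypothetical):

// Ask the subgraph what a user's pool balance was as of a specific block,
// POSTed to the subgraph endpoint exactly like the queries shown earlier.
const TIME_TRAVEL_QUERY = `{
  user(id: "0x0000000000000000000000000000000000000000", block: { number: 105000000 }) {
    poolBalance
  }
}`;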

In short, we have found it best to follow a development process that looks something like the following:

  1. Decide what the User Interface for the application should look like, and what the User Experience will be as users navigate through the application. All of this is contingent upon the underlying functionality of the smart contracts; for Rubicon, this is the order book protocol and associated liquidity pools.
  2. Define a schema that enables the User Interface that we have designed. Once you have mocked up your interface, it is relatively easy to know what data you will need to populate the application. When in doubt, simpler is better.
  3. Implement smart contract events that are able to populate the schema we have defined. This can sometimes be tricky, especially as the protocol increases in complexity, and necessitates an intimate understanding of the underlying smart contracts to know where and when events should be emitted. Creating a diagram of all potential user interactions with the smart contracts can help during this phase of the process.

Utilizing the approach above, Rubicon has been able to build an application that fully relies upon subgraphs for its backend, removing the need to depend upon a centralized backend and setting the foundation for a fully decentralized application.

A note on nodes. Everyone, if able, should run a node. Client Diversity is not just important, it's critical to the success of our collective efforts to build a truly decentralized internet.

Using EDD with Upgradeable Proxy Contracts

If you aren’t familiar with Upgradeable Proxy Contracts and would like to know more, OpenZeppelin has a wealth of knowledge on this topic. In short, Upgradeable Proxy Contracts let you update your smart contracts over time by separating core logic from the contract that users access. This allows logic to be updated while the access point remains the same. This is great, but it can present some odd behavior when updating contract events.

There are a variety of scenarios where you would want to upgrade an event to include more, less, or different data. Utilizing Proxy Contracts, this is really easy to do. So, you go in, move some variables around, update your contracts, only to discover when updating your mappings that you can’t index the same event name with different parameter structures! This is an unfortunate reality for many subgraph developers, and one we hope you can avoid by following good EDD practices.


In short, when you have used an event in the past, and want to update that event to get new data, RENAME THE EVENT. This will allow you to continue to index legacy data, while also indexing that new event. To enable this, you will need to modify your updated contract ABI to include this legacy event.


In this example, LogMake is a legacy event from v1 of the RubiconMarket.sol contract we have been exploring in this article. The timestamp is a value that can be retrieved from the block the transaction was included in, and as such doesn’t need to be included in the event. In order to modify this event without losing backward compatibility, we simply deprecate the old event in favor of a newly named event, as shown below.

/// V1 legacy event that is being deprecated:
/// event LogMake(
///     bytes32 indexed id,
///     bytes32 indexed pair,
///     address indexed maker,
///     ERC20 pay_gem,
///     ERC20 buy_gem,
///     uint128 pay_amt,
///     uint128 buy_amt,
///     uint64 timestamp
/// );

/// V2 currently used event, ABI of LogMake altered to reflect changes:
event RecordOffer(
    bytes32 indexed id,
    bytes32 indexed pair,
    address indexed maker,
    ERC20 pay_gem,
    ERC20 buy_gem,
    uint128 pay_amt,
    uint128 buy_amt
);

Now, we need to perform some modifications to our contract ABIs. To do so, we simply find the LogMake portion of our ABI from v1 of the protocol, and update our v2 ABI in our subgraph to include said event.

{
  "anonymous": false,
  "inputs": [
    {
      "indexed": true,
      "internalType": "bytes32",
      "name": "id",
      "type": "bytes32"
    },
    {
      "indexed": true,
      "internalType": "bytes32",
      "name": "pair",
      "type": "bytes32"
    },
    {
      "indexed": true,
      "internalType": "address",
      "name": "maker",
      "type": "address"
    },
    {
      "indexed": false,
      "internalType": "contract ERC20",
      "name": "pay_gem",
      "type": "address"
    },
    {
      "indexed": false,
      "internalType": "contract ERC20",
      "name": "buy_gem",
      "type": "address"
    },
    {
      "indexed": false,
      "internalType": "uint128",
      "name": "pay_amt",
      "type": "uint128"
    },
    {
      "indexed": false,
      "internalType": "uint128",
      "name": "buy_amt",
      "type": "uint128"
    },
    {
      "indexed": false,
      "internalType": "uint64",
      "name": "timestamp",
      "type": "uint64"
    }
  ],
  "name": "LogMake",
  "type": "event"
},

Now, with our newly updated ABI in place, we can update our subgraph.yaml file to ensure that we cover both legacy v1 events and live v2 events. This turns into something like the following:

abis:
  - name: RubiconMarket
    file: ./abis/RubiconMarketOptimism.json
eventHandlers:
  - event: RecordOffer(indexed bytes32,indexed bytes32,indexed address,address,address,uint128,uint128)
    handler: handleOffer
  - event: RecordTake(indexed bytes32,indexed bytes32,indexed address,address,address,address,uint128,uint128)
    handler: handleTake
  - event: RecordCancel(indexed bytes32,indexed bytes32,indexed address,address,address,uint128,uint128)
    handler: handleCancel
  - event: RecordFee(indexed bytes32,indexed address,indexed address,bytes32,address,uint256)
    handler: handleFee
  - event: RecordDelete(indexed bytes32,indexed bytes32,indexed address)
    handler: handleDelete
  # these are events from the v1 protocol stack, and are included to ensure data congruity
  - event: LogMake(indexed bytes32,indexed bytes32,indexed address,address,address,uint128,uint128,uint64)
    handler: handleLogMake
  - event: LogTake(bytes32,indexed bytes32,indexed address,address,address,indexed address,uint128,uint128,uint64)
    handler: handleLogTake
  - event: LogKill(indexed bytes32,indexed bytes32,indexed address,address,address,uint128,uint128,uint64)
    handler: handleLogKill
  - event: OfferDeleted(indexed bytes32)
    handler: handleOfferDeleted
  - event: FeeTake(indexed bytes32,indexed bytes32,address,indexed address,address,uint256,uint64)
    handler: handleFeeTake

And that's it!

By using proxies, we are able to update our contracts between versions without switching the main access address that the ecosystem has been built around. By following event-driven development principles, we are able to simply modify our subgraphs to index both legacy and current events, meaning our application can work off of existing schema structures. Better yet, we don’t lose all of our data when protocol upgrades happen.
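
To make that concrete, a hypothetical handleLogMake can write the legacy v1 event into the same Offer entity that the v2 handleOffer populates (entity and field names are illustrative):

import { LogMake } from "../generated/RubiconMarket/RubiconMarket";
import { Offer } from "../generated/schema";

export function handleLogMake(event: LogMake): void {
  // Legacy v1 offers land in the same Offer entity the v2 handler uses,
  // so the application keeps querying one schema across both protocol versions.
  let offer = new Offer(event.params.id.toHexString());
  offer.maker = event.params.maker;
  offer.pair = event.params.pair;
  offer.payGem = event.params.pay_gem;
  offer.buyGem = event.params.buy_gem;
  offer.payAmt = event.params.pay_amt;
  offer.buyAmt = event.params.buy_amt;
  offer.createdAt = event.params.timestamp; // v1 emitted its own timestamp
  offer.save();
}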

Rubicon has taken advantage of EDD to provide a seamless user experience as we update our contracts from v1 of the protocol to v2. This means that users are able to interact with the same interface, access all of their historical data, and utilize the newest version of the protocol without having to do a thing. This is something we are incredibly proud of, and it reflects our continued commitment to delivering the best DeFi experience possible, all thanks to EDD.

While EDD may not be a one-size-fits-all solution, it's worth considering as part of your dapp development journey. By identifying critical events, designing event-driven smart contracts, and using subgraphs, you can create scalable, maintainable, and responsive decentralized applications. We invite you to explore the potential of EDD and assess whether it could be a valuable addition to your development toolbox.

About Rubicon

Rubicon is an order book protocol built on Ethereum with a mission to accelerate and democratize global finance by building better markets for humanity. Order books help to deal with asymmetric information in marketplaces, and over time have become the most widely used system in traditional finance. Fully open sourced and accessible to all, Rubicon is committed to bringing fair and credibly neutral markets to the world. If Ethereum is a World Computer, Rubicon is a World Order Book.


Contribute to The Graph Builders Blog

As a decentralized project, we firmly believe that sharing knowledge is crucial for the growth and development of the entire ecosystem. The Graph Builders Blog is a platform for developers and those building with The Graph ecosystem to share their insights, experiences, and best practices related to building decentralized applications with The Graph.

By contributing to The Graph Builders Blog, you will have the opportunity to showcase your expertise, share solutions, and gain exposure to a community of like-minded builders. We are confident that your insights will inspire and educate others, as well as contribute to the vision of a fully decentralized future.

Perks of being an author on The Graph Builders Blog

  • Once The Graph editors approve your blog, it will be featured on The Graph site, reaching hundreds of thousands of readers, with you named as the author
  • We will highlight and tag you in social media posts, reaching nearly 300k Twitter followers
  • You will receive a “The Graph Builders Blog Author” POAP
  • You will be able to add “The Graph Builders Blog Author” to your LinkedIn and resume

To apply to be a contributor to The Graph Builders Blog, fill out this form and we will get back to you.

About The Graph

The Graph is the source of data and information for the decentralized internet. As the original decentralized data marketplace that introduced and standardized subgraphs, The Graph has become web3’s method of indexing and accessing blockchain data. Since its launch in 2018, tens of thousands of developers have built subgraphs for dapps across 50+ blockchains - including Ethereum, Arbitrum, Optimism, Base, Polygon, Celo, Fantom, Gnosis, and Avalanche.

As demand for data in web3 continues to grow, The Graph enters a New Era with a more expansive vision including new data services and query languages, ensuring the decentralized protocol can serve any use case - now and into the future.

Discover more about how The Graph is shaping the future of decentralized physical infrastructure networks (DePIN) and stay connected with the community. Follow The Graph on X, LinkedIn, Instagram, Facebook, Reddit, and Medium. Join the community on The Graph’s Telegram, and join technical discussions on The Graph’s Discord.

The Graph Foundation oversees The Graph Network. The Graph Foundation is overseen by the Technical Council. Edge & Node, StreamingFast, Semiotic Labs, The Guild, Messari, GraphOps, Pinax and Geo are eight of the many organizations within The Graph ecosystem.


Category
Graph Builders
Author
Denver Baumgartner
Published
June 9, 2023
