xERC20

In this tutorial, we will show you how to index xERC20 data using the Blockflow CLI. Blockflow provides a seamless and efficient solution for building "Serverless Backends" through our Managed Databases, Instances, and Custom APIs.

xERC20 is a cool new token standard from Connext that makes cross-chain interactions super smooth. Unlike regular ERC20 tokens, xERC20 tokens work across different blockchains, boosting both interoperability and liquidity. They use lockbox contracts to handle the locking and unlocking of tokens securely during cross-chain transactions. This means you can move assets between blockchains without relying on centralized exchanges, cutting down on transaction times and fees. Overall, xERC20 tokens are helping create a more connected and efficient decentralized finance (DeFi) world.

Project Overview

The purpose of this tutorial is to teach you how to index xERC20 data by creating subgraphs with the aid of Blockflow and its utilities. A short breakdown of the project:

  • Create a project directory and cd into it. Run the command blockflow init in the terminal and provide the relevant contract address to be tracked.

  • Add an appropriate schema to the studio.schema.ts file, based on the data emitted by the events to be tracked.

  • Write the handler functions for tracking each event.

  • Test the project using the command blockflow test and verify the data stored in the database.


Setting up the project

  • Download the Blockflow CLI using the command:

    npm i -g @blockflow-labs/cli

    You can omit the -g flag if a global installation is not required.

  • Create a project directory and cd into it to make it the working directory.

  • Use the blockflow init command to initiate the project.

  • Below are the fields that need to be entered for the project. You can accept the default value for each field by pressing [ENTER]. The last field asks you to select the instance trigger type from 4 choices. You can also add more than one trigger for a project.

Setting up the schema

The following schema is appropriate for the data emitted by the xERC20 and xERC20 Lockbox contract events:

import { String } from "@blockflow-labs/utils";

export interface Transfer {
  id: String;
  from_address: string;
  to_address: string;
  token_address: string;
  token_name: string;
  token_symbol: string;
  raw_amount: Number;
  raw_amount_str: string;
  amount: Number;
  amount_str: string;
  usd_amount: Number;
  usd_exchange_rate: string;
  transfer_type: string;
  transaction_from_address: string;
  transaction_to_address: string;
  transaction_hash: string;
  log_index: string;
  block_timestamp: string;
  block_hash: string;
}

export interface Balance {
  id: String;
  address: string;
  token_address: string;
  token_name: string;
  token_symbol: string;
  balance: string;
  raw_balance: string;
  usd_amount: string;
  usd_exchange_rate: string;
  block_timestamp: string;
  block_hash: string;
  is_past_holder: Boolean;
  is_holder: Boolean;
}

export interface Token {
  id: String;
  address: string;
  decimals: string;
  name: string;
  symbol: string;
  holder_count: string;
  burn_event_count: string;
  mint_event_count: string;
  transfer_event_count: string;
  total_supply: string;
  total_burned: string;
  total_minted: string;
  total_transferred: string;
}

export interface BridgeLimitsSet {
  id: String;
  mintingLimit: Number;
  burningLimit: Number;
  bridge: string;
  block_timestamp: string;
  block_hash: string;
  block_number: string;
}

export interface LockBoxSet {
  id: String;
  lockboxaddress: string;
  block_timestamp: string;
  block_hash: string;
  block_number: string;
}

export interface LockBoxData {
  id: String;
  lockboxaddress: string;
  senderAccount: string;
  depositedAmount: number;
  withdrawnAmount: number;
  netAmount: number;
}
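To make the field layout concrete, here is a hypothetical Transfer document with illustrative, made-up values (addresses, hashes, and amounts are invented for the example; the id joins the transaction hash and log index with a colon, as built later in the handler):

```typescript
// A hypothetical Transfer document matching the schema above.
// All values are illustrative placeholders.
const sampleTransfer = {
  id: "0xabababababababababababababababababababababababababababababababab:42",
  from_address: "0x0000000000000000000000000000000000000000",
  to_address: "0x1111111111111111111111111111111111111111",
  token_address: "0x2222222222222222222222222222222222222222",
  token_name: "Example xToken",
  token_symbol: "xEXT",
  raw_amount: 5e18,
  raw_amount_str: "5000000000000000000",
  amount: 5,
  amount_str: "5",
  usd_amount: 10,
  usd_exchange_rate: "2",
  transfer_type: "mint", // from_address is the zero address
  transaction_from_address: "0x3333333333333333333333333333333333333333",
  transaction_to_address: "0x2222222222222222222222222222222222222222",
  transaction_hash: "0xabababababababababababababababababababababababababababababababab",
  log_index: "42",
  block_timestamp: "1700000000",
  block_hash: "0xcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcdcd",
};
```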

On completing the schema, use the command blockflow typegen to generate types.

Writing the handler function

Now we move on to writing the handler functions, but first remove the unwanted event triggers from the studio.yaml file. Below is the studio.yaml with the events that track the data required by our schema above:

name: Project Apollo
description: A top-secret research project to the moon
startBlock: latest
userId: XXXXXXXX-XXXX-XXXX-XXXXXXXX-XXXXXXXX
projectId: XXXXXXXX-XXXX-XXXX-XXXXXXXX-XXXXXXXX
network: Ethereum
user: Jane-doe
schema:
  file: ./studio.schema.ts
execution: parallel
Resources:
  - Name: blockflow
    Abi: src/abis/blockflow.json
    Type: contract/event
    Address: "0xeA581cA64e4A384aE4dEA39bb083173CcBd2D817"
    Triggers:
      - Event: BridgeLimitsSet(uint256,uint256,address indexed)
        Handler: src/handlers/blockflow/BridgeLimitsSet.BridgeLimitsSetHandler
      - Event: LockboxSet(address)
        Handler: src/handlers/blockflow/LockboxSet.LockboxSetHandler
      - Event: Transfer(address indexed,address indexed,uint256)
        Handler: src/handlers/blockflow/Transfer.TransferHandler
  - Name: lockbox
    Abi: src/abis/lockbox.json
    Type: contract/event
    Address: "0x9141776017D6A8a8522f913fddFAcAe3e84a7CDb"
    Triggers:
      - Event: Deposit(address,uint256)
        Handler: src/handlers/lockbox/Deposit.DepositHandler
      - Event: Withdraw(address,uint256)
        Handler: src/handlers/lockbox/Withdraw.WithdrawHandler

Use the command blockflow codegen to generate handler templates in the /src/handlers directory.

Several files are generated in the handlers directory. We will look at the Deposit.ts file to see how funds are deposited into the lockbox and the corresponding tokens are received by the user. The first part of the code contains the imports, where we bring in the necessary schemas generated earlier in the src/types/schema.ts file.

We will now move on to the file Transfer.ts and import the necessary functions and schemas as shown below:

import { BigNumber } from "bignumber.js";
import { getTokenMetadata } from "../../utils/tokens";
import { ITransfer, Transfer } from "../../types/schema";
import { IBalance, Balance } from "../../types/schema";
import { IToken, Token } from "../../types/schema";

We now write the handler logic below the auto-generated boilerplate code in the handler. First, we bind the database connections for all the DBs. We also assign variables for the to and from address values and create the database ID, called the transactionId.

const ZERO_ADDR = "0x0000000000000000000000000000000000000000";
const tokenAddress = log.log_address;
const fromAddress = from.toLowerCase();
const toAddress = to.toLowerCase();
const transferType =
  fromAddress === ZERO_ADDR
    ? "mint"
    : toAddress === ZERO_ADDR
      ? "burn"
      : "transfer";
const tokenMetadata = getTokenMetadata(tokenAddress);
const transactionId =
  `${transaction.transaction_hash.toString()}:${log.log_index.toString()}`.toLowerCase();
const tokenDecimals = parseInt(tokenMetadata.decimals.toString());
const amount = new BigNumber(value).dividedBy(10 ** tokenDecimals);

const transferDB: Instance = bind(Transfer);
const balanceDB: Instance = bind(Balance);
const tokenDB: Instance = bind(Token);
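The classification and decimal-scaling steps above can be sketched in isolation. The handler uses bignumber.js; the dependency-free version below uses native bigint purely to keep the example self-contained, and the helper names are hypothetical:

```typescript
const ZERO_ADDR = "0x0000000000000000000000000000000000000000";

// Classify a transfer the same way the handler does: a transfer from the
// zero address is a mint, a transfer to it is a burn.
function classifyTransfer(from: string, to: string): string {
  if (from.toLowerCase() === ZERO_ADDR) return "mint";
  if (to.toLowerCase() === ZERO_ADDR) return "burn";
  return "transfer";
}

// Scale a raw integer amount by the token's decimals using bigint,
// returning a decimal string with no floating-point rounding.
function scaleAmount(rawValue: string, decimals: number): string {
  const base = 10n ** BigInt(decimals);
  const raw = BigInt(rawValue);
  const whole = raw / base;
  const frac = (raw % base).toString().padStart(decimals, "0").replace(/0+$/, "");
  return frac.length ? `${whole}.${frac}` : whole.toString();
}

classifyTransfer(ZERO_ADDR, "0x1111111111111111111111111111111111111111"); // "mint"
scaleAmount("1500000000000000000", 18); // "1.5"
```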

We now create an updateBalance function which takes the parameters used in the logic. We create a userTokenId by concatenating the address and the token address, then look up the document matching that userTokenId and assign it to the user variable. If no document is found, we use the create() method to create a new document in the balanceDB and assign values to it; if a user is found, we update the field values of address, token_address, etc., as shown below.

const updateBalance = async (
  balanceDB: Instance,
  tokenAddress: string,
  address: string,
  value: string,
  block: IBlock,
  isSender: boolean,
): Promise<IUpdateBalanceResult> => {
  const tokenMetadata = getTokenMetadata(tokenAddress);
  let isFirstTimeHolder = false;
  let isActiveHolder = true;

  const userTokenId = `${address}-${tokenAddress}`.toLowerCase();
  let user: IBalance = await balanceDB.findOne({ id: userTokenId });

  if (!user) {
    user = await balanceDB.create({
      id: userTokenId,
      is_past_holder: true,
      is_holder: true,
    });

    isFirstTimeHolder = true;
  }

  user.raw_balance = new BigNumber(user.raw_balance || "0")
    .plus(isSender ? `-${value}` : value)
    .toString();

  const tokenDecimals = parseInt(tokenMetadata.decimals.toString());

  const balance = new BigNumber(user.raw_balance)
    .dividedBy(10 ** tokenDecimals)
    .toString();

  user.address = address;
  user.token_address = tokenAddress;
  user.token_name = tokenMetadata.name;
  user.token_symbol = tokenMetadata.symbol;
  user.balance = balance;
  user.usd_amount = balance;
  user.usd_exchange_rate = balance;
  user.block_timestamp = block.block_timestamp;
  user.block_hash = block.block_hash;

  return { user, isFirstTimeHolder, isActiveHolder };
};
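The signed-delta arithmetic inside updateBalance can be sketched on its own. The handler uses bignumber.js; this dependency-free version with native bigint shows the same idea (a sender's balance decreases by value, a receiver's increases), with a hypothetical helper name:

```typescript
// Apply a transfer of `value` to a stored raw balance (both decimal strings).
// Mirrors the handler's logic: senders subtract the value, receivers add it.
function applyDelta(rawBalance: string, value: string, isSender: boolean): string {
  const current = BigInt(rawBalance || "0");
  const delta = BigInt(value);
  return (isSender ? current - delta : current + delta).toString();
}

applyDelta("1000", "400", true);  // "600"
applyDelta("0", "400", false);    // "400"
```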

Now, we create two variables, senderResult and receiverResult, pass each the parameters required by the updateBalance function, and then save the results to the DB with await.

const senderResult: IUpdateBalanceResult = await updateBalance(
  balanceDB,
  tokenAddress,
  fromAddress,
  value,
  block,
  true,
);
const receiverResult: IUpdateBalanceResult = await updateBalance(
  balanceDB,
  tokenAddress,
  toAddress,
  value,
  block,
  false,
);

await Promise.all([
  balanceDB.save(senderResult.user),
  balanceDB.save(receiverResult.user),
]);

We do the same for the other two DBs, transferDB and tokenDB.
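The token-side update can be sketched as a pure function. This is only an illustrative sketch of how the Token counters from the schema might be maintained: the helper name and exact counter logic are assumptions, and the real handler reads and saves a tokenDB document rather than working on a plain object. Only a subset of the Token fields is shown.

```typescript
// Subset of the Token schema fields relevant to counters.
type TokenStats = {
  transfer_event_count: string;
  mint_event_count: string;
  burn_event_count: string;
  total_supply: string;
};

// Hypothetical helper: apply one transfer event to a token's counters.
// Mints increase total_supply, burns decrease it; every event bumps the
// transfer count. Counters stay strings, matching the schema.
function applyTransferToToken(
  stats: TokenStats,
  transferType: string,
  rawValue: string,
): TokenStats {
  const inc = (s: string) => (BigInt(s || "0") + 1n).toString();
  const next = { ...stats, transfer_event_count: inc(stats.transfer_event_count) };
  if (transferType === "mint") {
    next.mint_event_count = inc(stats.mint_event_count);
    next.total_supply = (BigInt(stats.total_supply || "0") + BigInt(rawValue)).toString();
  } else if (transferType === "burn") {
    next.burn_event_count = inc(stats.burn_event_count);
    next.total_supply = (BigInt(stats.total_supply || "0") - BigInt(rawValue)).toString();
  }
  return next;
}
```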

Testing

We use the command below to test whether the handler logic is correct and the data is stored in our specified collection in Mongo.

blockflow instance-test --startBlock <block number> --clean --range 10 --uri <connection string>

The <block number> can be any block number; we can set it to latest if there is no preference. The --uri flag holds the MongoDB connection string; when testing locally, it should be set to mongodb://localhost:27017/blockflow_studio.
