Avocado

In this tutorial, we will show you how to index transactions on the Avocado ecosystem. Avocado is a next-generation smart contract wallet that enables multi-network transactions with built-in account abstraction, all while you stay connected to a single network, Avocado. This completely eliminates the need to switch between different networks.


Project Overview

The purpose of this tutorial is to teach you how to index transactions broadcast by Avocado by creating subgraphs with the aid of Blockflow's utilities. A short breakdown of the upcoming steps:

  • Create a project directory of your choice and cd into it, then initialize the Blockflow project in the directory.

  • Supply the appropriate schema.

  • Write the handler functions, moving shared logic into a utils.ts file for cleaner code.

  • Test the code and verify the data being populated into your MongoDB.


Setting up the project

  • Download the Blockflow CLI using the command: npm i -g @blockflow-labs/cli

  • Create a project directory and cd into it to make it your working directory.

  • Use the blockflow init command to initialize the project and get the standard Blockflow template.

  • Below are the fields that need to be entered for the project. You can also keep the default value for each field by pressing [ENTER]. The last field requires you to select the instance trigger type from four choices. You can add more than one trigger to a project.

The default values for the blockflow init command

Setting up the schema

The schema below has fields that track the broadcaster, network, actions, and other data relevant to a transaction. We also define an Action type for the actions field, since a single transaction can contain multiple actions.

type Action = {
  value: string;
  to: string;
  from: string;
  address: string;
};

export interface avoData {
  id: string;
  transactionHash: string;
  broadcaster: string;
  status: string;
  time: string;
  network: string;
  actions: Action[];
  user: string;
  avocadoWallet: string;
}

After filling studio.schema.ts with the appropriate schema, use the blockflow typegen command to update the schema.ts file in src/types. Now we will move on to writing the handler functions.
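To make the field shapes concrete, here is an illustrative record conforming to the schema above. The hash and addresses are placeholder values, not real chain data; note the id is the transaction hash joined with the log index.

```typescript
// Local copies of the schema types from studio.schema.ts, so this
// sketch is self-contained.
type Action = {
  value: string;
  to: string;
  from: string;
  address: string;
};

interface avoData {
  id: string;
  transactionHash: string;
  broadcaster: string;
  status: string;
  time: string;
  network: string;
  actions: Action[];
  user: string;
  avocadoWallet: string;
}

// Illustrative record only — placeholder hash and addresses.
const sample: avoData = {
  id: "0xabc123:0",
  transactionHash: "0xabc123",
  broadcaster: "0x0000000000000000000000000000000000000001",
  status: "Success",
  time: "1700000000",
  network: "ETH",
  actions: [],
  user: "0x0000000000000000000000000000000000000002",
  avocadoWallet: "0x0000000000000000000000000000000000000003",
};
```
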


Writing the handler function

There's a single event, Executed, which we need to track in executed.ts. Use the blockflow codegen command to generate the template in the executed.ts file in src/handlers. First, we import the schema:

import { avoData, IavoData } from "../types/schema";
import { getAllTransactionActions } from "../utils";

This imports the avoData schema we created earlier, along with the getAllTransactionActions function, which will be explained later. Below the template auto-generated by the Blockflow CLI, we write the handler logic. First, we bind the database connection and create a unique ID for each transaction that takes place.

export const ExecutedHandler = async (
  context: IEventContext,
  bind: IBind,
  secrets: ISecrets
) => {
  const { event, transaction, block, log } = context;
  const { avoSafeOwner, avoSafeAddress, source, metadata } = event;
  //template code ends here

  //binding the DB
  const avoDataDB: Instance = bind(avoData);
  //creating a unique id 
  const id = transaction.transaction_hash + ":" + log.log_index.toString();
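The ID scheme above, the transaction hash joined with the log index, can be sketched as a standalone helper. (makeEventId is a hypothetical name for illustration; the handler builds the string inline.)

```typescript
// "<transaction hash>:<log index>" uniquely identifies one event
// occurrence: a transaction can emit many logs, but each log index
// appears only once within that transaction.
function makeEventId(transactionHash: string, logIndex: number): string {
  return transactionHash + ":" + logIndex.toString();
}
```
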

Now, we call the getAllTransactionActions function to get the action data for each transaction log.

const actions = getAllTransactionActions(transaction.logs);

Moving on to src/utils, there is an index.ts file containing the logic for the above function. The file holds the topic0 value of the Transfer event. Inside the function, we extract the Transfer event data by matching this topic0 value against the various logs of the transaction, then return the data associated with each Transfer event, decoded from its topics. Below is the code for index.ts:

import { Interface } from "ethers";
import { ILog } from "@blockflow-labs/utils";

import erc20 from "../abis/erc20.json";

// prettier-ignore
const TOPIC_0 = "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef";

export function getAllTransactionActions(logs: Array<ILog>) {
  const transferlogs = logs
    ? logs.filter(
        (log) => log.topics[0].toLowerCase() === TOPIC_0.toLowerCase()
      )
    : [];

  return transferlogs.map((log) => {
    const decodedLog: any = decodeTransferLog(log.topics, log.log_data);
    return {
      from: decodedLog[0],
      to: decodedLog[1],
      value: decodedLog[2].toString(),
    };
  });
}

export function decodeTransferLog(topics: Array<string>, data: string) {
  const iface = new Interface(erc20);
  return iface.parseLog({ topics, data })?.args;
}
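For intuition, here is a dependency-free sketch of what the decoding step amounts to for an ERC-20 Transfer log. The real code above delegates this to ethers' Interface; this version reads the fields directly, since topics[1] and topics[2] are the 32-byte left-padded `from` and `to` addresses and the 32-byte data word is the `value`.

```typescript
const TRANSFER_TOPIC_0 =
  "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef";

// An address is the last 20 bytes (40 hex chars) of a 32-byte topic.
function topicToAddress(topic: string): string {
  return "0x" + topic.slice(-40);
}

// Decode a Transfer(from, to, value) log without an ABI library.
function decodeTransfer(topics: string[], data: string) {
  if (topics[0].toLowerCase() !== TRANSFER_TOPIC_0) return null;
  return {
    from: topicToAddress(topics[1]),
    to: topicToAddress(topics[2]),
    value: BigInt(data).toString(),
  };
}
```
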

Now we move on to creating the database entry and populating the field values. The transactionHash, broadcaster, and time come from Blockflow's context objects, which hold the transaction data.

await avoDataDB.create({
    id: id,
    transactionHash: transaction.transaction_hash,
    broadcaster: transaction.transaction_from_address,
    status: "Success",
    time: block.block_timestamp.toString(),
    network: "ETH",
    actions,
    user: avoSafeOwner,
    avocadoWallet: avoSafeAddress,
  });

Testing

We use the command below to test whether the handler logic is correct and the data gets stored in our specified collection in MongoDB.

blockflow instance-test --startBlock <block number> --clean --range 10 --uri <mongodb uri>

The <block number> can be any block; if you have no preference, set it to the latest. The --uri flag holds the MongoDB connection URL; when testing locally, it should be mongodb://localhost:27017/blockflow_studio.

To see the complete code for indexing Avocado, check out Blockflow's repository on GitHub.
