
Project Structure


Last updated 7 months ago

This page provides an overview of the structure of a Blockflow project. It covers the top-level files and folders, the configuration file, and the schema file.



Top-Level Folders

After completing the project initialization step, the CLI generates a folder structure with all the necessary boilerplate code set up. Let's explore this folder structure in detail.


studio.yaml

The studio.yaml file is a crucial component of your project. It serves as the main configuration file, defining the structure and behaviour of your indexer.

Version: 2.0.0
Type: instance
Metadata:
  name: Project Apollo
  description: A top-secret research project to the moon
Auth:
  userId: process.env.USER_ID
  projectId: process.env.PROJECT_ID
  accessKey: process.env.ACCESS_KEY
  secretKey: process.env.SECRET_KEY
Path:
  schema: ./studio.schema.ts
  docker: ./docker-compose.yml
Environment:
  testing:
    startBlock: latest
    network: Ethereum
    execution: parallel
    rpc: ENTER_YOUR_RPC_HERE
    range: 10
  deployment:
    startBlock: latest
    network: Ethereum
    execution: parallel
Resources:
  - name: usdc
    type: contract/event
    abi: src/abis/usdc.json
    address: '0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48'
    triggers:
      - event: Transfer(address indexed,address indexed,uint256)
        handler: src/handlers/usdc/Transfer.TransferHandler

This YAML file specifies the indexer's details, including:

  • The blockchain network it will connect to (e.g., Ethereum, Polygon, etc.)

  • The smart contracts it will index and their addresses

  • The events and data entities to be indexed from those contracts

  • The mapping functions that transform the raw blockchain data into the desired format
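Multiple triggers can be listed under a single resource. For example, a hypothetical Approval trigger could be added alongside Transfer, following the same pattern (the Approval handler path below is illustrative):

```yaml
triggers:
  - event: Transfer(address indexed,address indexed,uint256)
    handler: src/handlers/usdc/Transfer.TransferHandler
  - event: Approval(address indexed,address indexed,uint256)
    handler: src/handlers/usdc/Approval.ApprovalHandler
```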


studio.schema.ts

The studio.schema.ts file is a TypeScript file responsible for defining the database configuration and schema. It specifies the entities and their relationships within the indexed blockchain data.

const Transfer = {
  name: "Transfer",
  db: "mongodb",
  type: "managed",
  reorg: true,
  properties: {
    from: "string?",
    to: "string?",
    amount: "string?",
  },
};

module.exports = { Transfer };

In this file, you define the structure of the data entities that will be stored, along with the database configuration. These entities represent the various types of data you want to index from the blockchain, such as smart contracts, events, and any relevant associated data.

Note: comments are not allowed in studio.schema.ts.

Once you have defined the schema, run the following command to get started:

blockflow typegen

A types folder will be generated inside the src folder, containing a generated.ts file and a graphql.ts file.


handlers

The mapping functions defined in the handlers directory are automatically triggered and executed whenever a specific event or function is emitted or called on the blockchain. Each mapping function is associated with a particular event or function from the smart contracts being indexed, as defined in the studio.yaml file.

To generate boilerplate code for these handlers, use the following command:

blockflow codegen

Here is an example of a handler's boilerplate code:

import {
  IEventContext,
  ISecrets,
} from "@blockflow-labs/utils";

/**
 * @dev Event::Transfer(address from, address to, uint256 value)
 * @param context trigger object containing {event: {from, to, value}, transaction, block, log}
 * @param bind init function for database wrapper methods
 */
export const TransferHandler = async (
  context: IEventContext,
  bind: any,
  secrets: ISecrets,
) => {
  // Implement your event handler logic for Transfer here

  const { event, transaction, block, log } = context;
  const { from, to, value } = event;
};

We will dive deeper into this code in the handlers section.
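As a preview, here is a hypothetical filled-in version of the handler body. The bind(...) wrapper and its create() method are assumptions for illustration, not the confirmed Blockflow API; types are stubbed locally so the sketch stands alone:

```typescript
// Stub types for illustration; in a real project these come from
// @blockflow-labs/utils.
type IEventContext = { event: any; transaction: any; block: any; log: any };
type ISecrets = Record<string, string>;

export const TransferHandler = async (
  context: IEventContext,
  bind: any, // assumed: bind(entity) returns a wrapper exposing create()
  secrets: ISecrets,
) => {
  const { event, transaction, block, log } = context;
  const { from, to, value } = event;

  // Persist one record per Transfer event, matching the schema's
  // { from, to, amount } properties (all stored as strings).
  const transferDB = bind({ name: "Transfer" });
  await transferDB.create({
    from: String(from).toLowerCase(),
    to: String(to).toLowerCase(),
    amount: value.toString(),
  });
};
```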


abis

The abis folder is used to store the Application Binary Interface (ABI) files for the smart contracts being indexed.


docker-compose.yml

The root directory also contains a docker-compose.yml file, which defines the services required for running a local testing environment. This file includes configurations for MongoDB, PostgreSQL, and Adminer.

Note: It is recommended to leave this file unchanged to ensure your environment runs smoothly without errors.

services:
  mongodb:
    image: mongo:latest
    environment:
      MONGO_URL: ${MONGO_URL}
    ports:
      - "27017:27017"
    volumes:
      - mongodb_data:/data/db

  postgres:
    image: postgres:latest
    environment:
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data

  adminer:
    image: adminer
    restart: always
    ports:
      - 8080:8080
    environment:
      ADMINER_DEFAULT_SERVER: postgres

volumes:
  mongodb_data:
  postgres_data:


.env.example

A .env.example file is generated in the root directory as a sample configuration file. This file outlines the environment variables needed for your project. Before you start building or testing, rename it to .env.

Make sure to fill in the required credentials, especially before deploying your project. For local testing, you can leave the default values, but if you need to connect to remote databases, feel free to update the database configuration variables accordingly.

USER_ID=XXXX-XXXX-XXXX-XXXX
PROJECT_ID=XXXX-XXXX-XXXX-XXXX
ACCESS_KEY=XXXX-XXXX-XXXX-XXXX
SECRET_KEY=XXXX-XXXX-XXXX-XXXX
MONGO_URL="mongodb://localhost:27017/"
POSTGRES_USER="blockflow"
POSTGRES_PASSWORD="blockflow-test"