shuva10v/ton-analytics

Ton Analytics code

Various code for TON blockchain analysis.

Initial blockchain data extraction is performed with ton-indexer. PostgreSQL is not a good fit for heavy analytical workloads, so the data is loaded into an S3-based datalake. Airflow drives the incremental ETL process:

  • E: extract data from psql, sliced by date intervals
  • T: convert the results to Parquet file format
  • L: load the files into S3 object storage
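The slicing and partition layout above can be sketched as follows. This is an illustrative helper, not the repo's actual code: it splits an extraction window into monthly intervals and derives the matching datalake prefix (`dwh/staging/<table>/date=YYYYMM/`).

```python
from datetime import date

# Hypothetical sketch of the "E" step's slicing logic: yield one
# (lower_bound, upper_bound, s3_prefix) triple per calendar month.
def month_slices(start: date, end: date, table: str):
    year, month = start.year, start.month
    while date(year, month, 1) <= end:
        lower = date(year, month, 1)
        if month == 12:
            year, month = year + 1, 1
        else:
            month += 1
        upper = date(year, month, 1)  # exclusive upper bound
        yield lower, upper, f"dwh/staging/{table}/date={lower:%Y%m}/"

for lo, hi, prefix in month_slices(date(2022, 11, 1), date(2023, 1, 31), "transactions"):
    print(lo, hi, prefix)
```

Each slice would then be extracted from psql, written as a Parquet file, and uploaded under the corresponding prefix.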

Datalake tables:

| Table        | Prefix                                 |
|--------------|----------------------------------------|
| accounts     | dwh/staging/accounts/date=YYYYMM/      |
| transactions | dwh/staging/transactions/date=YYYYMM/  |
| messages     | dwh/staging/messages/date=YYYYMM/      |

After each incremental upload, a log entry is created in the psql table `increment_state`.
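The bookkeeping step can be sketched as below, using `sqlite3` as a stand-in for PostgreSQL; the `increment_state` schema and column names here are hypothetical, chosen only to illustrate how a run could record the interval it has uploaded.

```python
import sqlite3

# In-memory stand-in for the indexer's psql database.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE increment_state (
           table_name  TEXT,
           date_from   TEXT,
           date_to     TEXT,
           uploaded_at TEXT DEFAULT CURRENT_TIMESTAMP
       )"""
)

def log_increment(table_name: str, date_from: str, date_to: str):
    # Record the uploaded slice so the next Airflow run can resume
    # from the last processed interval instead of re-extracting.
    conn.execute(
        "INSERT INTO increment_state (table_name, date_from, date_to) VALUES (?, ?, ?)",
        (table_name, date_from, date_to),
    )
    conn.commit()

log_increment("transactions", "2023-01-01", "2023-02-01")
```

On the next run, the ETL would read the latest row for each table to pick its starting interval.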

A smart-contract data/messages parser extracts meaningful fields, for example NFT owners. Example invocation:

```sh
POSTGRES_URI=postgres://postgres:pass@localhost:5432/ton_index \
  python3 contracts_parser.py -c jetton -d cache/ -e http://localhost:9090/execute \
  jetton_contract_data.csv jetton_parsed.csv
```

Supported contract types:

Utils

Analysis

Some analysis scripts:
