About The Company

At Delta, we are reimagining and rebuilding the financial system. Join our team to make a positive impact on the future of finance.

🎯 Mission Driven: Re-imagine and rebuild the future of finance.

💡 The most innovative cryptocurrency derivatives exchange, with a daily traded volume of $3.5 billion and growing. Delta is bigger than all the Indian crypto exchanges combined.

📈 We offer the widest range of derivative products and have been serving traders across the globe since 2018, growing fast.

💪🏻 The founding team comprises IIT and ISB graduates. Our business co-founders previously worked at Citibank, UBS, and GIC, and our tech co-founder is a serial entrepreneur who previously co-founded TinyOwl and Housing.com.

💰 Funded by top crypto funds (Sino Global Capital, CoinFund, Gumi Cryptos) and crypto projects (Aave and Kyber Network).

Role Summary:

Support our analytics team by owning the full ETL lifecycle—from master data to analytics-ready datasets. You will build and maintain daily batch pipelines that process 1-10 million master-data rows per run (and scale up to tens or hundreds of millions of rows), all within sub-hourly SLAs. Extract from OLTP and time-series sources, apply SQL/stored-procedure logic or Python transformations, then load into partitioned, indexed analytics tables. Reads run exclusively on read-only replicas to guarantee zero impact on the production master DB. You’ll also implement monitoring, alerting, retries, and robust error handling to ensure near-real-time dashboard refreshes.
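
As a purely illustrative sketch of one such pipeline step in Python (using pandas and SQLAlchemy, as listed in the requirements below), assuming hypothetical connection strings, table, and column names rather than Delta's actual stack:

```python
# Minimal, hypothetical sketch of one incremental extract-transform-load step.
# Connection strings, tables, and columns are invented for illustration.
import pandas as pd
from sqlalchemy import create_engine, text

replica = create_engine("postgresql+psycopg2://etl_ro@replica-host/trading")    # read-only replica
warehouse = create_engine("postgresql+psycopg2://etl_rw@analytics-host/marts")  # analytics DB


def run_incremental_load(last_watermark: str) -> str:
    # Extract: pull only rows changed since the previous run (incremental pattern),
    # reading from the replica so the production master DB is untouched.
    df = pd.read_sql(
        text("SELECT * FROM trades WHERE updated_at > :wm"),
        replica,
        params={"wm": last_watermark},
    )
    if df.empty:
        return last_watermark

    # Transform: lightweight cleansing and derived columns in pandas.
    df["notional_usd"] = df["price"] * df["quantity"]

    # Load: append into a partitioned, indexed analytics table.
    df.to_sql("fact_trades", warehouse, schema="analytics",
              if_exists="append", index=False, chunksize=10_000)

    # Return the new watermark so the next run picks up where this one stopped.
    return df["updated_at"].max().isoformat()
```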

Requirements

Required Skills & Experience:

  • 4+ years in data engineering or analytics roles, building daily batch ETL pipelines at 1-10 M rows/run scale (and up to 100 M+)
  • Expert SQL skills, including stored procedures and query optimisation on Postgres, MySQL, or a similar RDBMS
  • Proficient in Python for data transformation (pandas, NumPy, SQLAlchemy, psycopg2)
  • Hands-on with CDC/incremental load patterns and batch schedulers (Airflow, cron); a scheduling sketch follows this list
  • Deep understanding of replicas, partitioning, and indexing strategies
  • Strong computer-science fundamentals and deep knowledge of database internals—including storage engines, indexing mechanisms, query execution plans and optimisers for MySQL and time-series DBs like TimescaleDB
  • Experience setting up monitoring and alerting (Prometheus, Grafana, etc.)
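
A rough, hypothetical Airflow 2.x DAG illustrating the scheduling, retry, and SLA points above; the dag_id, schedule, and timings are invented for the example:

```python
# Hypothetical Airflow 2.x DAG sketch; dag_id, schedule, and callables are
# illustrative, not taken from any actual pipeline.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_transform_load(**context):
    # Placeholder for the incremental-load routine sketched earlier.
    pass


with DAG(
    dag_id="nightly_analytics_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 1 * * *",   # one nightly batch run
    catchup=False,
    default_args={
        "retries": 3,                                 # automatic retries on failure
        "retry_delay": timedelta(minutes=5),
        "execution_timeout": timedelta(minutes=45),   # stay within the sub-hourly SLA
    },
) as dag:
    PythonOperator(
        task_id="extract_transform_load",
        python_callable=extract_transform_load,
    )
```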

Key Responsibilities:

  • Nightly Batch Jobs: Schedule and execute ETL runs
  • In-Database Transformations: Write optimised SQL and stored procedures
  • Python Orchestration: Develop Python scripts for more complex analytics transformations
  • Data Loading & Modelling: Load cleansed data into partitioned, indexed analytics schemas designed for fast querying (a schema sketch follows this list)
  • Performance SLAs: Deliver end-to-end sub-hourly runtimes
  • Monitoring & Resilience: Implement pipeline health checks, metrics, alerting, automatic retries, and robust error handling
  • Stakeholder Collaboration: Work closely with analysts to validate data quality and ensure timely delivery of analytics-ready datasets
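
As an illustration of the partitioned, indexed analytics schemas mentioned above, a hypothetical Postgres (11+) DDL sketch executed from Python; the schema, table, and partition names are invented for the example:

```python
# Hypothetical Postgres DDL (declarative partitioning, PG 11+) run from Python.
# Schema, table, and column names are illustrative only.
from sqlalchemy import create_engine, text

warehouse = create_engine("postgresql+psycopg2://etl_rw@analytics-host/marts")

DDL = """
CREATE SCHEMA IF NOT EXISTS analytics;

CREATE TABLE IF NOT EXISTS analytics.fact_trades (
    trade_id     bigint      NOT NULL,
    symbol       text        NOT NULL,
    price        numeric     NOT NULL,
    quantity     numeric     NOT NULL,
    notional_usd numeric     NOT NULL,
    traded_at    timestamptz NOT NULL
) PARTITION BY RANGE (traded_at);

-- One partition per day keeps nightly loads and dashboard scans small.
CREATE TABLE IF NOT EXISTS analytics.fact_trades_2024_01_01
    PARTITION OF analytics.fact_trades
    FOR VALUES FROM ('2024-01-01') TO ('2024-01-02');

-- Index the columns dashboards filter on most.
CREATE INDEX IF NOT EXISTS idx_fact_trades_symbol_time
    ON analytics.fact_trades (symbol, traded_at);
"""

with warehouse.begin() as conn:
    for stmt in DDL.split(";"):
        if stmt.strip():
            conn.execute(text(stmt))
```
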
Employment Type: On-site
