About The Position

The engineering team at Chainalysis is inspired by solving the hardest technical challenges and creating products that build trust in cryptocurrencies. We’re a global organization with teams in the UK, Denmark, Canada, and the USA who thrive on challenging work and on doing it alongside exceptionally talented teammates. Our industry changes every day, and our job is to build a flexible platform that allows us to adapt to those rapid changes. We’re looking for a Senior Data Platform Engineer to join our Data Cloud team. This group accelerates innovation and helps unlock new business lines by empowering anyone at Chainalysis to quickly and reliably discover, access, analyze, and build on top of any and all data. You’ll be a key leader in creating and optimizing our petabyte-scale data storage, processing, and querying platforms. If you’re passionate about enabling high-performance interactive analytics, real-time streaming data applications, and deploying cloud infrastructure at scale with enterprise-grade reliability, we want you to join our talented and growing team!

Requirements

  • 6+ years of experience as a Data Platform Engineer, Data Engineer, or Data Infrastructure Engineer, with hands-on expertise in building and maintaining large-scale, cloud-based data platforms
  • Passion for leading and contributing to the technical vision of the team and organization, strong ownership of mission-critical systems, and dedication to honing your craft while mentoring others
  • Experience building and maintaining both batch and streaming data pipelines using tools such as DBT, Databricks, Apache Spark, or Apache Flink, as well as a deep understanding of data architecture and data modeling best practices
  • Expertise with AWS services, cloud architecture, fault-tolerant distributed data systems, and proficiency with Terraform for provisioning and managing cloud infrastructure
  • Deep understanding of modern data lakehouse architectures and the surrounding ecosystem (e.g., Kafka, Flink, Spark, Databricks, Snowflake, DBT, Airflow, Debezium, Delta Lake/Iceberg, StarRocks, ClickHouse), and proficiency with Python/Java and SQL
  • Experience building tools and frameworks to accelerate the development of data pipelines, and familiarity with data governance, data quality, and observability best practices

Nice To Haves

  • Exposure to or interest in the cryptocurrency technology ecosystem
  • Experience working with different blockchain technologies is a plus

Responsibilities

  • Help define the technical vision of the team and organization, and articulate how our data platform and architecture could evolve
  • Design, implement, and optimize our high-performance, scalable data serving platform that enables data querying and consumption across the organization and in external-facing data products
  • Design, implement, and optimize our high-performance, scalable data storage and transformation platform that enables both batch and stream processing, handling 100+ million updates per day on datasets of more than 100 billion rows
  • Build seamless integrations between the Data Cloud and various relational and NoSQL OLTP databases
  • Build batch and streaming data pipelines for core blockchain datasets widely used across the company
  • Deploy cloud infrastructure at scale with enterprise-grade reliability, implement and maintain infrastructure automation and self-service, and create robust CI/CD pipelines
  • Establish and maintain observability, security, and data governance solutions to ensure high quality, efficiency, and reliability of data pipelines

Benefits

  • We encourage applicants across any race, ethnicity, gender/gender expression, age, spirituality, ability, experience and more. If you need any accommodations to make our interview process more accessible to you due to a disability, don't hesitate to let us know.