Senior Data Engineer

Kroll
Remote

About The Position

We are seeking a high-performing Senior Data Engineer to help build and scale the data infrastructure that powers analytics, automation, and AI across the firm. This role is for candidates who want real engineering responsibility, not shadow work. You will design and implement production-grade data pipelines, work with cloud-native tooling, and partner with senior engineers and data scientists on systems that matter.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • 5+ years of experience delivering business-critical data engineering solutions for large enterprises, with a consistent track record of success
  • Experience writing ETL/ELT jobs
  • Experience with the Azure and Databricks platforms
  • Experience with Python, SQL, and REST APIs
  • Excellent communication skills and the ability to reason about trade-offs
  • Ability to work with an international team
  • Understanding of cloud architecture principles: compute, storage, networking, security, and cost
  • Proficiency with open-source tools and frameworks such as FastAPI, Pydantic, Polars, Pandas, Delta Lake, Docker, and Kubernetes
  • Knowledge of CI/CD, Git, and infrastructure-as-code concepts
  • Strong project management skills, with the ability to prioritize tasks and manage multiple projects simultaneously in an Agile environment
  • Understanding of how data engineering feeds into Business Intelligence and reporting tools (Power BI/Tableau)
  • Strong problem-solving and analytical skills
  • Strategic thinking and a strong execution orientation
  • Ability to work in cross-functional teams
  • Attention to detail and data quality

Responsibilities

  • Design and build scalable organizational data infrastructure, including a Medallion architecture, within a Lakehouse environment
  • Develop robust, fault-tolerant ETL/ELT applications for seamless data ingestion, transformation, and distribution to enable analytics, reporting, and AI workloads
  • Partner with stakeholders and teams across the firm on data-related technical solutions and support their data infrastructure needs
  • Explore and experiment with new use cases, frameworks, and tools to enhance AI capabilities, ensuring data integrity, quality, and reliability
  • Identify and implement infrastructure re-designs to improve scalability, optimize data delivery, and automate manual workflows
  • Select the most appropriate tools, services, and resources for building robust data pipelines
  • Collaborate with cross-functional teams to understand data requirements, create robust data models, and deliver actionable insights
  • Monitor, troubleshoot, and optimize jobs for performance, addressing data pipeline bottlenecks and ensuring cost efficiency
  • Continuously improve engineering processes, balancing speed, quality, and business impact
  • Coach, mentor, and provide technical guidance to junior engineers, fostering a culture of continuous learning and development
  • Stay current on emerging technologies and trends in data engineering, recommending and implementing innovative solutions

Benefits

  • Comprehensive medical, dental, and vision plans
  • Generous paid time off (PTO), paid company holidays, and parental and family leave
  • Life insurance, short- and long-term disability coverage, and accident protection
  • Competitive salary structures, performance-based incentives, and merit-based compensation reviews
  • 401(k) plans with company matching