Senior Data Engineer (ID 65436)

AgileEngine, Downey, CA
Hybrid

About The Position

We are looking for a Senior Data Engineer to design and maintain scalable data pipelines and platforms supporting Accounting, Finance, Payments, and Tax functions within a large-scale financial systems platform. You will build ETL/ELT workflows using Python, dbt, and Airflow, working across Snowflake, Redshift, Kafka, and Spark to ensure data availability and reliability for critical business operations. The role operates in a cross-functional Agile environment with direct stakeholder engagement across Engineering, Product, and Analytics teams.

Requirements

  • 5+ years of professional experience in Data Engineering or Data Warehousing roles
  • Strong programming skills in Python
  • 5+ years of experience building data pipelines using ETL/ELT tools, with dbt preferred
  • 3+ years of hands-on experience with big data technologies such as Snowflake, Redshift, Kafka, Spark, or Hive
  • Extensive experience working with both SQL and NoSQL databases
  • Strong expertise with workflow orchestration and pipeline management tools such as Airflow
  • Strong understanding of scalable data architecture and engineering best practices
  • Excellent communication and stakeholder management skills
  • Proven ability to manage competing priorities and deliver within agreed sprint commitments
  • Comfortable working in highly collaborative, cross-functional Agile teams
  • Self-starter mindset with strong analytical, problem-solving, and critical-thinking abilities
  • Master’s degree in Computer Science, Mathematics, Statistics, or a related technical field preferred; a Bachelor’s degree with relevant experience is also acceptable
  • Upper-intermediate English level

Nice To Haves

  • Experience working in cloud-based data environments
  • Familiarity with modern data warehousing and distributed systems concepts
  • Exposure to data governance, observability, or data quality frameworks

Responsibilities

  • Design, develop, and maintain scalable and reliable ETL/ELT pipelines
  • Build and optimize data workflows using tools such as dbt and Airflow
  • Develop and maintain data solutions leveraging technologies such as Snowflake, Redshift, Kafka, Spark, and Hive
  • Work with both SQL and NoSQL databases to support data ingestion, transformation, and analytics needs
  • Monitor, troubleshoot, and improve data pipeline performance, scalability, and reliability
  • Collaborate closely with cross-functional teams including Engineering, Product, Analytics, and Business stakeholders
  • Participate in Agile ceremonies and contribute to sprint planning and delivery commitments
  • Manage multiple priorities effectively while ensuring timely and high-quality deliverables
  • Contribute to data architecture discussions and help drive engineering best practices

Benefits

  • Mentorship
  • TechTalks
  • Personalized growth roadmaps
  • Competitive compensation
  • USD-based pay
  • Education budget
  • Fitness budget
  • Team activity budgets
  • Flexible schedule
  • Remote options
  • Office options