Sr Data Engineer

HR Recruiting Services
Glendale, CA
Hybrid

About The Position

We are seeking a Senior Data Engineer based in Glendale, CA, to play a pivotal role in building and maintaining a robust, scalable, and high-performance Core Data platform. This position requires a hands-on expert who thrives in a dynamic, innovation-driven environment and contributes to the development of real-time and batch data pipelines using modern cloud-native technologies. You will collaborate closely with product managers, architects, and cross-functional engineering teams to deliver reliable, high-quality data solutions that power critical business functions across Engineering, Data Science, Analytics, and Operations. Your work will directly impact the integrity, performance, and scalability of our data infrastructure, ensuring adherence to SLAs and industry best practices. The role is onsite four days per week, offering the opportunity to be deeply embedded in a collaborative, high-impact technical culture.

Requirements

  • 5+ years of data engineering experience, specifically developing large-scale data pipelines using Spark, Airflow, Databricks or Snowflake, SQL, and Python
  • Proficiency in Python, Java, or Scala, with a strong foundation in software engineering principles
  • Hands-on experience with distributed processing systems such as Apache Spark in production
  • Proven experience with data pipeline orchestration tools, particularly Apache Airflow
  • Demonstrated experience with cloud-based MPP databases such as Snowflake, BigQuery, or Databricks
  • Experience developing APIs using GraphQL or similar technologies
  • Deep understanding of OLTP vs OLAP systems and their respective use cases
  • Strong background in distributed data processing, data service software engineering, or data modeling
  • Must pass a background check
  • Requires onsite presence 4 days per week

Responsibilities

  • Design, build, and maintain scalable data pipelines using Python, AWS, Spark, Databricks, and Airflow
  • Develop and manage real-time streaming data pipelines leveraging Delta Lake and other modern data streaming technologies
  • Build and maintain APIs (including GraphQL) to expose data assets to downstream applications and services
  • Collaborate with product managers, architects, and engineers to define platform requirements and drive technical execution
  • Establish and enforce internal and external standards for pipeline configuration, naming conventions, and data governance
  • Ensure operational excellence, data accuracy, and reliability across all datasets to meet SLAs and stakeholder expectations
  • Contribute to documentation and continuous improvement of data platform architecture and best practices

Benefits

  • Three levels of medical insurance for you and your family
  • Dental insurance (family)
  • 401K
  • Overtime
  • Sick leave per California policy: accrue 1 hour for every 30 hours worked, up to 48 hours. If you are based in a different state, please inquire about that state's sick leave policy.