Senior Data Engineer

ANRGI TECH, Santa Monica, CA

About The Position

The Sr. Data Engineer will contribute to the Company’s success by partnering with business, analytics, and infrastructure teams to design and build data pipelines that support the measurement of key KPIs and metrics. Collaborating across disciplines, they will identify internal and external data sources, design table structures, define ETL strategy, and implement automated data quality checks.
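To illustrate the kind of automated data quality check described above, here is a minimal Python sketch; the rule names, thresholds, and the `order_id` key field are hypothetical, and a real pipeline would run rules of this sort against a warehouse table rather than an in-memory batch.

```python
from dataclasses import dataclass


@dataclass
class QualityResult:
    rule: str
    passed: bool
    detail: str


def check_rows(rows, key_field, min_rows=1):
    """Run two basic data quality rules over a batch of records:
    a minimum row count and a null/missing check on a key field."""
    results = []

    # Rule 1: the extract should not be empty (or suspiciously small).
    results.append(QualityResult(
        rule="min_row_count",
        passed=len(rows) >= min_rows,
        detail=f"{len(rows)} rows (expected >= {min_rows})",
    ))

    # Rule 2: the business key must be present on every record.
    missing = sum(1 for r in rows if r.get(key_field) in (None, ""))
    results.append(QualityResult(
        rule="key_not_null",
        passed=missing == 0,
        detail=f"{missing} rows missing '{key_field}'",
    ))
    return results


if __name__ == "__main__":
    batch = [{"order_id": "A1", "amount": 10.0},
             {"order_id": None, "amount": 5.0}]
    for res in check_rows(batch, key_field="order_id"):
        print(res.rule, res.passed, res.detail)
```

In practice, checks like these are scheduled after each load and a failed rule either alerts the team or blocks downstream consumers.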

Requirements

  • Data engineering skills, including advanced SQL, Python, Spark, Snowflake, Databricks, and Airflow or Prefect
  • Experience with at least one cloud platform (AWS / Azure / GCP)
  • 5+ years of relevant data engineering experience.
  • Strong understanding of data modeling principles, including dimensional modeling and data normalization.
  • Good understanding of SQL engines and the ability to conduct advanced performance tuning.
  • Ability to think strategically, analyze and interpret market and consumer information.
  • Strong written and verbal communication and presentation skills.
  • Excellent conceptual and analytical reasoning competencies.
  • Comfortable working in a fast-paced and highly collaborative environment.
  • Familiarity with Agile Scrum principles and ceremonies

Nice To Haves

  • Experience working with Kafka
  • Familiarity with tools like Datorama / Improvado / Fivetran for integrating, harmonizing, and visualizing data across platforms.
  • Familiarity with CI/CD tools (e.g., Jenkins, GitHub Actions, etc.) & Docker containers
  • Exposure to monitoring tools like Datadog
  • Bachelor’s/Master’s degree in Computer Science, Engineering, or related field.
  • Strong analytical and problem-solving skills.
  • Effective communication and teamwork abilities.

Responsibilities

  • Partner with technical and non-technical colleagues to understand data and reporting requirements
  • Work with engineering teams to collect required data from internal and external systems
  • Design table structures and define ETL pipelines to build performant data solutions that are reliable and scalable in a fast-growing data ecosystem
  • Develop automated data quality checks
  • Develop and maintain ETL routines using orchestration tools such as Airflow
  • Implement database deployments using tools like SchemaChange
  • Perform ad hoc analysis as necessary.
  • Perform SQL and ETL tuning as necessary.
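The core pipeline responsibilities above (collect, transform, load, with a quality gate) can be sketched as plain Python functions of the kind that map one-to-one onto Airflow tasks; the function names, fields, and the dict standing in for a warehouse are all hypothetical.

```python
# Minimal extract -> transform -> load sketch. In Airflow, each function
# would typically become its own task (for example via PythonOperator or
# the @task decorator); here they are chained directly for illustration.

def extract():
    # Stand-in for pulling records from an internal or external system.
    return [{"user_id": 1, "spend": "12.50"},
            {"user_id": 2, "spend": "7.25"}]


def transform(rows):
    # Cast types so downstream metrics can aggregate the field.
    return [{"user_id": r["user_id"], "spend": float(r["spend"])}
            for r in rows]


def quality_gate(rows):
    # Fail the run rather than load bad data downstream.
    if not rows:
        raise ValueError("no rows extracted")
    return rows


def load(rows, warehouse):
    # Stand-in for a Snowflake/Databricks write; appends to a dict here.
    warehouse.setdefault("fact_spend", []).extend(rows)
    return len(rows)


def run_pipeline(warehouse):
    return load(quality_gate(transform(extract())), warehouse)


if __name__ == "__main__":
    wh = {}
    print(run_pipeline(wh))  # number of rows loaded
```

Keeping each stage a small, side-effect-light function is what makes the routine easy to schedule, retry, and test once it is wired into an orchestrator.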
© 2024 Teal Labs, Inc