Senior Data Engineer

Comfrt
$160,000 - $180,000 · Remote

About The Position

We are seeking a highly skilled and experienced Senior Data Engineer to join our growing data team. The ideal candidate will be responsible for designing, developing, and optimizing our data pipeline architecture to support our analytics, machine learning, and business intelligence initiatives. This role requires a strong background in big data technologies, cloud infrastructure, and a passion for building robust, scalable, and efficient data solutions.

Requirements

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.
  • 5+ years of professional experience in data engineering, software engineering, or a related role focused on data infrastructure.
  • Proven experience designing and building production-grade, highly reliable, and scalable data pipelines.
  • Expert proficiency in SQL and at least one high-level programming language (e.g., Python, Scala, Java).
  • Deep expertise with major cloud platforms (AWS, GCP, or Azure), specifically their data and storage services (e.g., S3, Google Cloud Storage, Azure Data Lake Storage).
  • Solid experience with modern data warehousing solutions (e.g., Snowflake, Google BigQuery, Amazon Redshift).
  • Experience with workflow orchestration tools (e.g., Apache Airflow, Dagster).
  • Familiarity with distributed data processing frameworks (e.g., Apache Spark, Dask).
  • Experience with version control systems (e.g., Git).
  • Experience with AI tools (e.g., Claude, Vertex AI, Gemini).

Nice To Haves

  • Experience with stream processing technologies (e.g., Apache Kafka, Kinesis, Pub/Sub).
  • Exposure to integrating generative AI tools into the development process.
  • Knowledge of data modeling techniques (e.g., 3NF, Dimensional Modeling).
  • Familiarity with machine learning pipelines (MLOps).
  • Experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation).
  • Experience or familiarity with the D2C/e-commerce domain.

Responsibilities

  • Design, construct, install, test, and maintain highly scalable data management systems and processing pipelines using cloud-native services (e.g., AWS, GCP, Azure).
  • Develop and optimize ETL/ELT processes to ingest, transform, and load data from various internal and external sources into our data warehouse/data lake.
  • Implement data governance, security, and quality controls across all data pipelines.
  • Ensure data architecture supports the needs of data scientists, analysts, and other business stakeholders.
  • Monitor, tune, and optimize data warehouse performance (e.g., Snowflake, BigQuery, Redshift).
  • Troubleshoot and resolve complex data-related issues and performance bottlenecks in the data platform.
  • Drive continuous improvement in data platform reliability, efficiency, and cost management.
  • Collaborate closely with data scientists, software engineers, and product managers to understand data requirements and deliver solutions.
  • Define and enforce best practices for data engineering, including coding standards, documentation, and operational procedures.
  • Mentor and guide junior data engineers, fostering a culture of technical excellence and continuous learning.

Benefits

  • Generous paid time off
  • Company-covered health insurance
  • 5% 401(k) match
  • Discounts on all Comfrt products!
  • Flexibility and collaborative support of a fully remote environment