Data Engineer - Data Platform

Jobgether
Posted 8 days ago · $96,000 – $192,000 · Remote

About The Position

This role is ideal for a skilled data professional who thrives on designing, building, and maintaining scalable data infrastructure in a fast-paced, high-growth environment. You will contribute to the architecture of a modern data ecosystem, enabling cross-functional teams to access reliable, consistent, and timely data for analytics, machine learning, and product innovation. The position combines hands-on data engineering with strategic input on data modeling, pipeline optimization, and platform reliability. Operating in a fully remote setting, you will work with cutting-edge technologies to support near real-time data processing and deliver high-quality, auditable datasets. Your contributions will directly influence the organization’s data-driven decision-making and operational efficiency.

Requirements

  • 5+ years of experience as a Data Engineer, Data Warehouse Engineer, or related role.
  • Strong proficiency in Python or Scala; experience with additional programming languages is a plus.
  • Expertise in big data technologies such as Apache Spark, PySpark, or similar frameworks.
  • Hands-on experience with data lake and data warehousing solutions (e.g., Presto, Athena, Glue) and data modeling best practices.
  • Proven ability to build robust ETL pipelines using workflow orchestration tools like Airflow.
  • Excellent SQL and data manipulation skills, with experience handling datasets of high volume and velocity.
  • Experience gathering business requirements for data sourcing and translating them into technical solutions.
  • Strong problem-solving, analytical, and communication skills in a remote, collaborative environment.

Nice To Haves

  • Experience with streaming data platforms such as Kafka or Apache Flink.

Responsibilities

  • Build, maintain, and optimize scalable ETL pipelines for internal and external data sources.
  • Ensure data quality, reliability, and auditable workflows across the data platform.
  • Design, deploy, and maintain distributed data stores serving as central sources of truth.
  • Collaborate with internal stakeholders to understand data requirements and translate them into technical solutions.
  • Develop and configure self-service tools for data extraction, analysis, and reporting.
  • Evaluate and prototype new technologies to improve data engineering processes and platform capabilities.
  • Support real-time and batch data processing, including integration of streaming technologies like Kafka or Apache Flink.

Benefits

  • Competitive salary: $96,000 – $192,000, with potential bonus, equity, and wellness allowances.
  • Fully remote work with flexible scheduling.
  • Opportunity to work with large-scale datasets and modern data technologies.
  • Exposure to data systems supporting analytics, machine learning, and product innovation.
  • Professional growth in a fast-paced, technology-driven environment.
  • Inclusive and diverse culture that values contribution, merit, and innovation.

What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 11-50 employees
