About The Position

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Data Engineer in the United States. In this role, you will be responsible for building, maintaining, and optimizing the data pipelines and infrastructure that support critical business operations and analytics. You will collaborate closely with Revenue Operations, BI, and Product teams to ensure data across the organization is accurate, accessible, and reliable. The position requires hands-on experience with ETL/ELT pipelines, cloud data warehouses, and API integrations, along with strong analytical and problem-solving skills. You will help shape data architecture best practices, improve data integrity, and drive scalable solutions. The role offers exposure to cloud technologies and modern data tools in a collaborative environment where your contributions directly impact decision-making and operational efficiency.

Requirements

  • Minimum of 2 years in a data engineering or data management role.
  • Strong SQL skills across relational databases (PostgreSQL, MySQL, T-SQL, etc.).
  • Experience with Python and PySpark for data transformation and processing.
  • ETL/ELT pipeline development and maintenance expertise.
  • Knowledge of cloud data warehousing (AWS Redshift, Azure SQL, Snowflake, GCP BigQuery, etc.).
  • Familiarity with CI/CD processes and source control (Git, GitLab, Azure DevOps, Jenkins).
  • Understanding of data architecture concepts (dimensional, relational) and data privacy/compliance.
  • Strong analytical, organizational, and problem-solving skills, with attention to detail.
  • Excellent communication and collaboration skills in a remote, cross-functional environment.

Responsibilities

  • Partner with BI Analysts, Operations, Product, and Engineering teams to define and assess data requirements.
  • Design, implement, and maintain ETL/ELT pipelines and data integrations.
  • Build and manage data architecture, including relational and dimensional databases or cloud data warehouses.
  • Monitor data infrastructure performance, ensuring high availability and reliability.
  • Recommend improvements to data architecture, ETL processes, and workflows to increase scalability and efficiency.
  • Ensure data integrity, compliance, and alignment with organizational business objectives.
  • Collaborate on API querying, data sourcing, and cross-functional data initiatives.

Benefits

  • Competitive compensation and benefits package.
  • Fully remote working arrangement.
  • Opportunities for professional development, including Pluralsight, conferences, and certifications.
  • Supportive and inclusive work culture promoting work-life balance.
  • Exposure to cutting-edge data engineering tools and cloud technologies.
  • Collaborative, entrepreneurial environment with a focus on innovation and continuous learning.