Senior Engineer, Data Management (Remote)

Jobgether
$121,000 - $143,000

About The Position

We are seeking a highly skilled Senior Data Management Engineer to lead the design, development, and automation of enterprise-scale data pipelines. In this role, you will collaborate closely with engineering, architecture, and analytics teams to ensure efficient data ingestion, transformation, and quality management. You will work on modern cloud platforms, including Azure and Databricks, and help operationalize AI/ML solutions. This position offers the opportunity to shape data strategy, implement best practices, and mentor team members while contributing to a fast-paced, Agile environment.

Requirements

  • 5+ years of experience in Enterprise Data Management or Data Engineering roles.
  • Hands-on experience building metadata-driven pipelines using Azure Data Factory and Databricks/Spark for cloud data lakes.
  • Advanced SQL skills and experience in Python/PySpark for data wrangling and analysis.
  • Experience with large-scale databases such as Snowflake, Netezza, Oracle, SQL Server, MySQL, or Teradata.
  • Familiarity with DevOps practices, version control tools (Azure DevOps, GitLab), and multi-developer environments.
  • Knowledge of traditional ETL platforms like Informatica, Datastage, Pentaho, or Ab Initio.
  • Understanding of containerization (Docker, Kubernetes) and cloud automation tools (Terraform, Azure CLI, PowerShell).
  • Experience with BI tools such as Power BI, Tableau, or OBIEE.
  • Strong communication, collaboration, and problem-solving skills.
  • Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field.

Nice To Haves

  • Personal attributes: self-starter, collaborative, curious, highly motivated, and team-oriented.

Responsibilities

  • Collaborate with engineering and enterprise architecture teams to define standards, design patterns, and CI/CD automation practices.
  • Build, automate, and maintain data ingestion, transformation, and aggregation pipelines using Azure Data Factory, Databricks/Spark, Snowflake, Kafka, and scheduling tools like Control-M or CA Workload Automation.
  • Implement metadata-driven approaches to promote self-service pipelines and reusable data frameworks.
  • Design and maintain data quality testing, audit, and monitoring frameworks.
  • Conduct complex data analysis to support business queries and reporting tools such as Power BI, Tableau, or Looker.
  • Document data flow diagrams, data models, technical mappings, and production support processes.
  • Support Data Literacy programs by mentoring team members and training business users.
  • Ensure compliance with data security, privacy, and governance best practices.

Benefits

  • Competitive annual salary: $121,000 - $143,000
  • Flexible work arrangements, including remote and hybrid schedules
  • Paid time off, wellness days, and a paid volunteer day
  • Comprehensive medical, dental, and vision insurance
  • Life and disability insurance, 401(k) plan with company match
  • Paid parental and adoption leave, fertility and adoption benefits
  • Mental health and wellness support through apps and programs
  • Mobile stipend and merchandise discounts across brands
  • Career development and internal mobility opportunities
  • Access to Associate Resource Groups and a collaborative global team