Senior Data Engineer

HDR · Omaha, NE

About The Position

At HDR, our employee-owners are fully engaged in creating a welcoming environment where each of us is valued and respected, a place where everyone is empowered to bring their authentic selves and novel ideas to work every day. As we foster a culture of inclusion throughout our company and within our communities, we constantly ask ourselves: What is our impact on the world? Watch Our Story: https://www.hdrinc.com/our-story

Each and every role throughout our organization makes a difference in our ability to change the world for the better. Read further to learn how you could help make great things possible not only in your community, but around the world.

Position Overview

The Senior Data Engineer is responsible for designing, building, and optimizing scalable data solutions that support analytics, operations, and enterprise decision-making. This role owns full-stack data engineering, from ingestion through transformation, modeling, orchestration, and delivery, leveraging Snowflake, SQL, Python, Azure, and Terraform. The Senior Data Engineer also mentors junior engineers, contributes to platform strategy, and ensures best practices in engineering, security, and DevOps.

Requirements

  • Bachelor's degree in Computer Science, Management Information Systems, or a related technical field, or an equivalent combination of education and experience
  • 3-5 years of experience in administering systems-related technologies
  • Excellent client service and interpersonal skills
  • Ability to analyze and design enterprise-wide infrastructure technology systems
  • Ability to communicate with users and information technology professionals
  • Ability to work with vendors to request service and work through defective product issues
  • Self-starter with the ability to handle multiple tasks and deadlines with minimal supervision
  • Attitude and commitment to being an active participant of our employee-owned culture

Nice To Haves

  • Minimum 5 years of experience in data engineering.
  • Advanced proficiency in SQL, Python, Terraform, and data modeling.
  • Deep experience with Snowflake and Azure cloud environments.
  • Strong experience with transformation tools such as dbt or Coalesce.
  • Experience with streaming architectures such as Kafka or Azure EventHub.
  • Strong understanding of CI/CD, DevOps principles, and Git-based workflows.
  • Excellent problem-solving, communication, and leadership skills.
  • Experience with orchestration frameworks such as Airflow, Dagster, or Prefect.
  • Experience with containerization technologies including Docker or Kubernetes.

Responsibilities

  • Architect, build, and maintain scalable, resilient data pipelines across batch and streaming workloads.
  • Develop end-to-end data pipelines in our Snowflake data platform using technologies like Azure Data Factory, Oracle Integration Cloud, Terraform, Azure EventHub or equivalent services.
  • Implement data transformations, data models, and testing frameworks using tools like Coalesce or dbt (Data Build Tool).
  • Model structured and semi-structured data for analytical, operational, and downstream consumption.
  • Optimize Snowflake performance including warehouse sizing, clustering, micro-partitioning, and query optimization.
  • Build secure, scalable Snowflake schemas, tasks, pipes, and streams using Infrastructure as Code best practices.
  • Automate infrastructure provisioning using Terraform.
  • Build CI/CD pipelines for data workflows using GitHub Actions, Azure DevOps, or similar tools.
  • Ensure strong engineering standards across versioning, testing, and automation.
  • Lead adoption of orchestration tools such as Airflow, Dagster, or ADF native orchestration.
  • Develop monitoring, alerting, and data quality frameworks across pipelines.
  • Improve system reliability, error recovery, and operational visibility.
  • Mentor junior engineers and review code for quality, performance, and maintainability.
  • Partner with analytics, engineering, and product teams to deliver high-impact solutions.
  • Influence data platform strategy and long-term architecture decisions.