Senior Data Engineer

Nerdery
Hybrid

About The Position

Nerdery is a digital product consultancy that partners with clients to grow their business and delight customers through intuitive, thoughtfully designed technology. They solve problems creatively across strategy, design, and technology. The Senior Data Engineer role is responsible for designing and implementing data architecture, storage solutions, and analytics platforms to drive business insights. This role focuses on building scalable, well-governed data platforms on AWS and migrating legacy infrastructure to modern cloud environments.

Requirements

  • Bachelor's degree in Computer Science, Data Engineering, or a related field (or equivalent practical experience).
  • 6+ years of professional experience in data engineering or a closely related discipline.
  • In-depth knowledge of Snowflake, including warehouse design, role-based access control, performance tuning, and cost optimization.
  • Experience migrating data pipelines and infrastructure to Snowflake from other platforms (legacy warehouses, on-premise systems, or hybrid environments).
  • Advanced proficiency with dbt for modular, tested, and version-controlled data transformations.
  • Hands-on experience with Dagster for pipeline orchestration, asset management, and data observability.
  • Proven experience designing and implementing data pipelines, data storage solutions, and analytics platforms on modern cloud data stacks.
  • Advanced proficiency in Python with hands-on experience in data-processing libraries such as Pandas, Polars, PySpark, or similar.
  • Strong SQL skills across both analytical and transactional workloads.
  • Experience with relational and/or columnar databases such as PostgreSQL, SQL Server, or Snowflake.
  • Deep understanding of data modeling, ETL/ELT processes, and data warehousing principles.
  • Experience implementing data architecture patterns (e.g., medallion/multi-hop, data mesh, lakehouse) to support scalable, well-governed data platforms.
  • Familiarity with Infrastructure-as-Code tools such as Pulumi or Terraform.
  • Familiarity with event-driven architectures and integrating orchestration platforms with cloud services.
  • Strong version-control practices (Git) with experience building CI/CD workflows for data workloads (e.g., GitHub Actions).
  • Proven ability to translate business and technical requirements into production-grade, maintainable data pipelines.
  • Excellent problem-solving and analytical skills.
  • Ability to communicate clearly with both technical and non-technical stakeholders.
  • Proactive collaborator who mentors colleagues and raises the technical bar across the team.
  • Customer-Focused Execution and Communication: Excels at translating deep customer understanding into impactful work, ensuring that every project and decision delivers exceptional user value. Able to effectively explain technical decisions to non-technical stakeholders.
  • Tenacious Problem-Solving: Relentlessly unravels complex problems, developing innovative solutions to overcome any challenge that stands in the way of progress.
  • Integrity-Driven Work: Builds trust by consistently upholding high standards in all work and advocating for the right approach, ensuring quality and transparency.
  • Collaborative Impact: Actively elevates the team's capabilities by fostering a collaborative environment, sharing knowledge, and prioritizing collective success over individual credit.
  • End-to-End Ownership: Takes full accountability for an initiative's entire lifecycle, from concept to completion, ensuring the final result successfully achieves its intended goals.
  • Dedication to Craftsmanship: Driven by a passion for their craft to continuously learn, deepen their expertise, and strive for excellence in their chosen field.

Nice To Haves

  • Exposure to containerization (Docker) is a plus.
  • Familiarity with data quality and observability tooling (e.g., dbt tests, Great Expectations, Monte Carlo) is desirable.

Responsibilities

  • Design and implement end-to-end data pipelines and ETL/ELT processes.
  • Architect scalable data platforms using patterns such as medallion/multi-hop, data mesh, or lakehouse.
  • Develop data-processing logic using Python (Pandas, PySpark, Polars) and advanced SQL.
  • Automate data workflows using orchestration tools like Airflow/MWAA and Step Functions.
  • Implement Infrastructure-as-Code (Terraform/CloudFormation) and CI/CD for data workloads.
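To give a flavor of the day-to-day work described above, here is a minimal, illustrative sketch of an ELT step in Python. It uses the standard-library sqlite3 module purely as a stand-in for a cloud warehouse such as Snowflake; all table and column names (`raw_orders`, `fct_orders`) are hypothetical, not part of Nerdery's actual stack.

```python
import sqlite3

# Stand-in warehouse: an in-memory SQLite database
# (in practice this would be Snowflake or similar).
conn = sqlite3.connect(":memory:")

# "Load" step: land raw rows as-is.
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 120.0, "paid"), (2, 75.5, "paid"), (3, 30.0, "refunded")],
)

# "Transform" step: build a cleaned, analytics-ready model in SQL
# (the kind of transformation dbt would manage as a versioned model).
conn.execute(
    """
    CREATE TABLE fct_orders AS
    SELECT id, amount
    FROM raw_orders
    WHERE status = 'paid'
    """
)

total = conn.execute("SELECT SUM(amount) FROM fct_orders").fetchone()[0]
print(total)  # 195.5
```

In a production pipeline, the transform would live in a dbt model, be orchestrated by a tool such as Dagster, and run against a managed warehouse rather than SQLite.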

Benefits

  • Choose from two comprehensive medical plans (including an HSA-eligible option), plus high-quality dental and vision insurance.
  • We provide peace of mind by fully covering the cost of several essential plans: Basic Life and AD&D Insurance, Short-Term Disability (STD), and Long-Term Disability (LTD) coverage.
  • Take advantage of discounted, employee-paid options to protect your family, pets, and assets, including Voluntary Life and AD&D, Accident, Critical Illness, and Hospital Indemnity insurance, Pet Insurance, and plans offering Legal Support and Identity Theft Protection.
  • We offer a 401(k) plan with a company match of up to 3.5% with immediate vesting.
  • Nerds enjoy flexible time off with Flex PTO that can be used for vacation, personal time, personal illness, or time off to care for dependents.
  • Nerdery is a remote-friendly workplace. Our Minnesota office space is available for Nerds who prefer an in-office environment or collaborative setting.
  • All Nerds have dedicated resources and access to funds to pursue professional development, attend industry conferences, and obtain certifications.