Cloud Data Engineer

National Debt Relief, LLC.
$114,000 - $131,000 · Remote

About The Position

National Debt Relief (NDR) is seeking a Cloud Data Engineer to join our Data Engineering team. In this role, you will contribute to the orchestration, automation, and optimization of data workflows across our Snowflake-based enterprise data platform. Reporting to the Senior Data Engineering Manager, you will help maintain and extend scalable, production-ready data systems, including dbt transformation workflows, Python-based ingestion pipelines, and the observability frameworks that power our analytics ecosystem.

The ideal candidate has hands-on experience with SQL and dbt, Python-based data pipelines, and exposure to modern orchestration tools like Dagster. Exposure to Infrastructure as Code (Terraform) and CI/CD practices for data systems is valuable. You will work at the intersection of data engineering and platform operations, supporting reliable, automated, and governed data workflows at scale. This role requires strong problem-solving skills, clear communication with technical stakeholders, and the ability to deliver consistent results with minimal oversight.

Requirements

  • Bachelor’s degree in Computer Science, Data Engineering, or a related field preferred.
  • 4+ years of experience in analytics engineering, data engineering, or data warehouse development.
  • Experience working with CI/CD workflows and automated testing for data pipelines.
  • Experience working with observability frameworks for data freshness, quality, and reliability.
  • Hands-on experience with Dagster (or Airflow, with a strong desire to work in Dagster) for managing event-driven pipelines and orchestrated assets (see the sketch after this list).
  • Strong proficiency in SQL and dbt for building and maintaining curated datasets and data transformation pipelines.
  • Solid expertise with Snowflake, with exposure to managing infrastructure with IaC tools (Terraform or equivalent).
  • Proficiency in Python for developing pipelines, APIs, and automation solutions.
  • Ability to work independently and deliver results with minimal oversight.
  • Clear, timely, and proactive communication, including experience collaborating with technical stakeholders.
  • Ability to manage multiple priorities and projects, ensuring progress stays visible and deliverables are met.
  • Troubleshooting and problem-solving skills, with attention to detail when working with sensitive systems and processes.
  • Strong collaboration and communication skills to partner effectively across data engineering, analytics, and product teams.
  • Self-starter who can follow and contribute to established data orchestration standards.
  • General computer competency.
  • Prioritize multiple tasks and projects simultaneously.
  • Exceptional written and verbal communication skills.
  • Punctual and ready to report to work on a consistent basis.
  • Meet and maintain high performance expectations on a monthly basis.
  • Work in a fast-paced, high-volume setting.
  • Use and navigate multiple computer systems with exceptional multi-tasking skills.
  • Remain calm and professional during difficult discussions.
  • Take constructive feedback.
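
For candidates less familiar with Dagster's asset model, here is a minimal sketch of the kind of Python-defined orchestrated assets referenced above. The asset names, data, and job are hypothetical illustrations, not NDR's actual pipelines:

    import dagster as dg
    import pandas as pd

    @dg.asset
    def raw_payments() -> pd.DataFrame:
        # Placeholder extract step; a real pipeline would ingest from an
        # API or landing table rather than building data inline.
        return pd.DataFrame({"payment_id": [1, 2], "amount_usd": [100.0, 250.0]})

    @dg.asset
    def curated_payments(raw_payments: pd.DataFrame) -> pd.DataFrame:
        # Downstream asset; Dagster wires the dependency from the
        # parameter name matching the upstream asset.
        return raw_payments[raw_payments["amount_usd"] > 0]

    defs = dg.Definitions(
        assets=[raw_payments, curated_payments],
        jobs=[dg.define_asset_job("daily_refresh", selection=dg.AssetSelection.all())],
    )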

Nice To Haves

  • Experience in financial services or related industries.
  • Expertise in maintaining orchestration systems (Dagster, Airflow).
  • Interest in expanding skills across orchestration, infrastructure, and data platform operations.

Responsibilities

  • Contribute to the maintenance of data pipelines in our orchestration tool to improve ingestion, transformation, and data quality workflows across the enterprise data platform.
  • Develop and maintain Python-based ingestion pipelines when custom pipelines are required, integrating data from APIs and third-party systems (see the ingestion sketch after this list).
  • Support Snowflake infrastructure using IaC (Terraform or similar), while adhering to Data Engineering best practices.
  • Maintain and optimize dbt transformation workflows to support curated and trusted data models for analytics and operations.
  • Understand and apply methods to optimize Snowflake performance and reduce compute spend through warehouse tuning, efficient query design, and resource-utilization monitoring.
  • Respond to morning load failures to minimize business impact (East Coast working hours).
  • Maintain robust data security and access controls within Snowflake, ensuring compliance with governance and privacy standards.
  • Maintain CI/CD workflows for data pipelines, including automated testing, deployment, and version control practices.
  • Maintain observability frameworks for data pipelines, including freshness checks, data contract enforcement, and automated alerting for anomalies (a freshness-check sketch also follows this list).
  • Periodically review pull requests for dbt models and pipeline changes, providing feedback to maintain code quality and consistency.
  • Participate in team discussions around pipeline design, data modeling decisions, and incident triage.
  • Document system architectures, workflows, and configurations to support governance, reproducibility, and transparency.
  • Drive consistent, visible deliverables that demonstrate progress and impact, ensuring projects remain on track.
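
As context for the custom Python ingestion work above, here is a minimal sketch of an API ingestion step. The endpoint URL and landing path are hypothetical, and a production version would add authentication, pagination, and schema validation:

    import json
    import pathlib
    import requests

    def ingest_endpoint(url: str, landing_dir: str) -> pathlib.Path:
        # Pull one payload from a third-party API and persist the raw
        # response for downstream loading into Snowflake.
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()  # surface failures so the orchestrator can retry/alert
        out_path = pathlib.Path(landing_dir) / "extract.json"
        out_path.write_text(json.dumps(resp.json()))
        return out_path

    # e.g., ingest_endpoint("https://api.example.com/v1/accounts", "/tmp/landing")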
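
And as one example of the observability work above, the following sketch checks a table's freshness against a staleness threshold. The loaded_at column, environment-variable credentials, and 24-hour cutoff are assumptions for illustration; a real framework would also route alerts:

    import os
    from datetime import datetime, timedelta, timezone

    import snowflake.connector

    def table_is_fresh(table: str, max_lag_hours: int = 24) -> bool:
        # Compare the table's latest load timestamp to a staleness cutoff.
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
        )
        try:
            cur = conn.cursor()
            cur.execute(f"SELECT MAX(loaded_at) FROM {table}")
            latest = cur.fetchone()[0]  # assumes a TIMESTAMP_TZ loaded_at column
            cutoff = datetime.now(timezone.utc) - timedelta(hours=max_lag_hours)
            return latest is not None and latest >= cutoff
        finally:
            conn.close()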

Benefits

  • Generous Medical, Dental, and Vision Benefits
  • 401(k) with Company Match
  • Paid Holidays, Volunteer Time Off, Sick Days, and Vacation
  • 12 weeks Paid Parental Leave
  • Pre-tax Transit Benefits
  • No-Cost Life Insurance Benefits
  • Voluntary Benefits Options
  • ASPCA Pet Health Insurance Discount
  • Access to your earned wages at any time before payday