Data Engineer - BELAY

BELAY · Atlanta, GA
$110,000 – $150,000 · Remote

About The Position

BELAY is a growing, vibrant, Atlanta, GA-based company that offers virtual staffing solutions in the areas of Virtual Assistants, Marketing Assistants, and Financial Specialists. We are a fast-paced team of high performers who work extremely hard but also know how to have a great time. Culture is a top priority, and our values are lived out daily. Below are the position requirements.

Job Purpose

The Data Engineer will play a critical role in designing and maintaining BELAY's modern data stack. This role is responsible for building reliable, scalable data pipelines, optimizing warehouse performance, and implementing robust orchestration frameworks. The ideal candidate is highly technical, systems-minded, and experienced in Snowflake-centric environments with strong PostgreSQL and data orchestration expertise. You should be comfortable owning data infrastructure end to end and partnering cross-functionally with analytics, finance, and engineering teams.

Requirements

  • 5+ years of experience in data engineering or related field
  • Strong hands-on experience with Snowflake
  • Strong experience with PostgreSQL
  • Proven expertise with Apache Airflow or Dagster and workflow orchestration
  • Experience building and maintaining modern ELT/ETL pipelines
  • Advanced SQL skills and strong data modeling experience
  • Experience with orchestration and pipeline monitoring best practices
  • Familiarity with cloud data architectures (AWS, GCP, or Azure)
  • Strong problem-solving and performance tuning skills
  • Excellent communication and documentation abilities

Nice To Haves

  • Experience with dbt or similar transformation frameworks
  • Experience with real-time or streaming pipelines
  • Infrastructure-as-code experience (Terraform or similar)
  • Experience supporting finance or operational analytics teams
  • Background in high-growth or services-based companies

Responsibilities

  • Design, build, and maintain scalable ELT/ETL pipelines
  • Develop and optimize data models in Snowflake/PostgreSQL
  • Ensure data reliability, integrity, and performance across the platform
  • Implement best practices for data warehousing and pipeline efficiency
  • Build and manage workflows using Apache Airflow/Dagster
  • Design robust orchestration frameworks for batch and near-real-time pipelines
  • Monitor pipeline health and implement proactive alerting
  • Troubleshoot and resolve data pipeline failures quickly
  • Work extensively with Snowflake and PostgreSQL for source systems and operational workloads
  • Optimize queries and database performance
  • Design and maintain data ingestion patterns from Postgres to Snowflake
  • Tune Snowflake workloads for cost and performance efficiency
  • Implement partitioning, clustering, and workload management strategies
  • Continuously improve pipeline speed and reliability
  • Partner with analytics, finance, and product teams to support reporting needs
  • Translate business requirements into scalable data solutions
  • Mentor junior data engineers and promote best practices
  • Contribute to the long-term data architecture roadmap