Lower • Posted 2 months ago
Full-time • Senior
Columbus, OH
1,001-5,000 employees

We are seeking a Senior Data Engineer to play a key role in building and optimizing our data infrastructure to support business insights and decision-making. In this role, you will design and enhance denormalized analytics tables in Snowflake, build scalable ETL pipelines, and ensure data from diverse sources is transformed into accurate, reliable, and accessible formats. You will collaborate with business and sales stakeholders to gather requirements, partner with developers to ensure critical data is captured at the application level, and optimize existing frameworks for performance and integrity. This role also includes creating robust testing frameworks and documentation to ensure quality and consistency across data pipelines.

Responsibilities:

  • Design, develop, and optimize high-performance ETL/ELT pipelines using Python, dbt, and Snowflake.
  • Build and manage real-time ingestion pipelines leveraging AWS Lambda and change data capture (CDC) systems.
  • Develop scalable serverless solutions with AWS, adopting event-driven architecture patterns.
  • Manage containerized applications using Docker and infrastructure as code via GitHub Actions.
  • Create sophisticated, multi-layered Snowflake data models optimized for scalability, flexibility, and performance.
  • Integrate and manage APIs for Salesforce, Braze, and various financial systems, emphasizing robust error handling and reliability.
  • Implement robust testing frameworks, data lineage tracking, monitoring, and alerting.
  • Enhance and manage CI/CD pipelines, drive migration to modern orchestration tools (e.g., Dagster, Airflow), and manage multi-environment deployments.

Qualifications:

  • 5+ years of data engineering experience, ideally with cloud-native architectures.
  • Expert-level Python skills, particularly with pandas, SQLAlchemy, and asynchronous processing.
  • Advanced SQL and Snowflake expertise, including stored procedures, external stages, performance tuning, and complex query optimization.
  • Strong proficiency with dbt, including macro development, testing, and automated deployments.
  • Production-grade pipeline experience, specifically with AWS Lambda, S3, API Gateway, and IAM.
  • Proven experience with REST APIs, authentication patterns, and handling complex data integrations.
  • Background in financial services or fintech, particularly loan processing, customer onboarding, or compliance.
  • Experience with real-time streaming platforms like Kafka or Kinesis.
  • Familiarity with Infrastructure as Code tools (Terraform, CloudFormation).
  • Knowledge of BI and data visualization tools (Tableau, Looker, Domo).
  • Container orchestration experience (ECS, Kubernetes).
  • Understanding of data lake architectures and Delta Lake.

Benefits:

  • Competitive salary and comprehensive benefits (healthcare, dental, vision, 401(k) match)
  • Hybrid work environment (primarily remote, with two days a week in downtown Columbus, Ohio)
  • Professional growth opportunities and internal promotion pathways
  • Collaborative, mission-driven culture recognized locally and nationally as a 'best place to work'