Data Engineer

The National Society of Leadership and Success (NSLS)

About The Position

The National Society of Leadership and Success (NSLS) is the largest accredited leadership honor society in the United States, with over 800 chapters and more than 2 million members. The Data Engineer will join a team of 100+ purpose-driven staff members in a friendly, focused, fast-paced entrepreneurial environment.

The Data Engineer will lead the transformation of our analytics infrastructure. You'll play a critical role in migrating from a legacy Apache Hop and Redshift system to a modern Snowflake and dbt Cloud stack, establishing a single source of truth for our business stakeholders, and building maintainable pipelines that will serve as the foundation for NSLS's data platform for years to come.

This is a rare opportunity to join at an inflection point where you can make an outsized impact. You'll replace unnecessarily complex "spaghetti code" with clean, modular systems and implement best practices that set the standard for future engineers. If you enjoy working in dbt, value readable and maintainable code, and want to build something new with modern tools, this role is for you.

Requirements

  • 2-5 years of experience as a Data Engineer, Analytics Engineer, or similar role
  • Expert SQL skills: You write advanced, readable SQL including CTEs, window functions, and query optimization techniques
  • dbt proficiency: You've built and maintained production dbt projects and understand modeling best practices (this is critical for the role)
  • Python fundamentals: Comfortable working with dataframes, querying APIs, and writing scripts for data processing
  • Modern data stack familiarity: Experience with Snowflake, AWS, and orchestration tools like dbt Cloud or Airflow
  • Code craftsmanship: You write clean, modular, well-documented code that others can easily understand and maintain
  • AI-assisted development: Proficient with AI coding tools (Claude Code, GitHub Copilot, Cursor, or similar) to accelerate development, debug efficiently, and learn new technologies quickly

Nice To Haves

  • Experience with reverse ETL tools (Hightouch, Census, etc.)
  • Familiarity with Fivetran or similar ingestion platforms
  • Background in batch processing and API integrations
  • Experience with Hex, PostHog, or similar analytics/CDP tools

Responsibilities

  • Contribute to the Snowflake migration: Work with the broader data team to deprecate our legacy Redshift and Apache Hop infrastructure by completing the migration to Snowflake and dbt Cloud
  • Build production-grade dbt pipelines: Develop and maintain SQL transformations in dbt that power analytics for business stakeholders across the organization
  • Establish data architecture: Refine and maintain our medallion architecture (bronze, silver, and gold layers) to create clear separation of concerns and a single source of truth
  • Collaborate with the data team: Partner with our AWS Engineer, Analytics Engineer, and Business Analyst in a sprint-based workflow with code reviews and regular standups
  • Maintain dbt transformations (60-70% of role): Own approximately 15-20 core models and summary views that drive business reporting, ensuring they're performant, well-documented, and easy to maintain
  • Build new data pipelines: Ingest data from APIs and new sources as business needs evolve
  • Enable reverse ETL: Develop and manage batch processes to send transformed data to downstream services like HubSpot using tools like Hightouch
  • Monitor data quality: Implement testing and alerting to catch issues before they impact stakeholders
  • Contribute to technical standards: Help establish and maintain best practices for code quality, documentation, and data modeling