Senior Data Engineer

January
New York, NY

About The Position

As January's founding Senior Data Engineer, you'll transform how we leverage data to expand access to credit — not by fixing what's broken, but by unlocking what's possible. You'll take full ownership of our modern data stack, evolving it from a capable system maintained part-time by analysts and engineers into a world-class platform that anticipates and enables our most ambitious data initiatives. You'll design the data infrastructure that helps millions achieve financial stability, ensuring every insight flows seamlessly from production to decision-makers. By establishing data engineering as a core discipline at January, you'll free our analysts to focus on insights while you architect the scalable foundation that powers our next phase of growth.

Requirements

  • 5+ years in data engineering or analytics engineering with progressive technical responsibility
  • Deep expertise with modern data warehouses (Snowflake, BigQuery, or Redshift) including performance tuning and cost optimization
  • Advanced SQL skills — you can write elegant queries and debug why that 45-minute monster is destroying our compute budget
  • Production experience with dbt or similar transformation tools, including testing and documentation best practices (a minimal data-quality check sketch follows this list)
  • Proven ability to build and maintain ETL/ELT pipelines at scale using modern orchestration tools
  • Track record of designing data models that balance analytical flexibility with performance at scale
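
For a concrete sense of the testing expectations above, the sketch below shows the kind of volume and freshness check a pipeline might run after each load. It is purely illustrative: the payments table, the thresholds, and the use of SQLite in place of a warehouse connection are assumptions made for the example; in practice the equivalent checks would more likely live as dbt tests or run against Snowflake over its standard Python connector.

```python
import sqlite3
from datetime import datetime, timedelta, timezone


def check_table_health(conn, table, ts_column, min_rows=1, max_staleness_hours=24):
    """Run basic volume and freshness checks over any DB-API connection.

    Returns a list of failure messages; an empty list means the table passed.
    """
    failures = []
    cur = conn.cursor()

    # Volume check: the table should contain at least `min_rows` rows.
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    row_count = cur.fetchone()[0]
    if row_count < min_rows:
        failures.append(f"{table}: expected >= {min_rows} rows, found {row_count}")

    # Freshness check: the newest record should be recent enough.
    cur.execute(f"SELECT MAX({ts_column}) FROM {table}")
    latest = cur.fetchone()[0]
    if latest is None:
        failures.append(f"{table}: no values found in {ts_column}")
    else:
        age = datetime.now(timezone.utc) - datetime.fromisoformat(latest)
        if age > timedelta(hours=max_staleness_hours):
            failures.append(f"{table}: stale by {age}, limit is {max_staleness_hours}h")

    return failures


if __name__ == "__main__":
    # Tiny in-memory example so the sketch runs end to end.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE payments (id INTEGER, loaded_at TEXT)")
    conn.execute(
        "INSERT INTO payments VALUES (1, ?)",
        (datetime.now(timezone.utc).isoformat(),),
    )
    print(check_table_health(conn, "payments", "loaded_at") or "all checks passed")
```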

Nice To Haves

  • Experience with streaming architectures and real-time analytics
  • Familiarity with ML infrastructure and feature stores
  • Knowledge of financial data privacy regulations and compliance
  • Previous startup or high-growth company experience

Responsibilities

  • Own and optimize our entire data platform — taking our Snowflake warehouse from analyst-maintained to engineer-optimized while standardizing data models for customer reporting, operational dashboards, and ML features
  • Build self-healing data pipelines — designing ETL processes that scale automatically with volume, implementing monitoring that catches issues before anyone notices, and optimizing costs without sacrificing performance (a minimal retry-and-alert sketch follows this list)
  • Democratize data access — creating intuitive models that help PMs, analysts, and ops teams find answers independently while maintaining security and compliance requirements
  • Bridge engineering and analytics — establishing feedback loops between production systems and analytical needs, ensuring schema changes don't break downstream dependencies, and influencing how new features generate data
  • Institute modern data practices — implementing testing frameworks, building CI/CD pipelines for infrastructure changes, and creating documentation that enables others to extend your work
  • Drive strategic infrastructure decisions — identifying where new tools unlock capabilities, balancing quick wins with architectural vision, and building the foundation for an eventual data engineering team
  • Deliver immediate impact through key projects including Data Model Redesign, Pipeline Reliability, Cost Optimization, and Analytics Enablement
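
To ground the "monitoring that catches issues before anyone notices" responsibility, here is a minimal sketch of a retry-and-alert wrapper around a single pipeline step. It is a hypothetical illustration in plain Python: load_daily_payments and notify are stand-ins for whatever orchestrator tasks and paging integration the team actually runs, not a description of January's stack.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")


def notify(message):
    # Stand-in for a real alerting integration (Slack, PagerDuty, etc.).
    log.error("ALERT: %s", message)


def run_with_retries(task, *, retries=3, backoff_seconds=30):
    """Run one pipeline step, retrying transient failures and alerting when exhausted."""
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception as exc:  # in practice, catch narrower exception types
            log.warning("attempt %d/%d of %s failed: %s", attempt, retries, task.__name__, exc)
            if attempt == retries:
                notify(f"{task.__name__} failed after {retries} attempts: {exc}")
                raise
            time.sleep(backoff_seconds * attempt)  # simple linear backoff


def load_daily_payments():
    # Placeholder for an actual extract/load step against a source system.
    raise RuntimeError("upstream API timed out")


if __name__ == "__main__":
    try:
        run_with_retries(load_daily_payments, retries=2, backoff_seconds=1)
    except RuntimeError:
        log.info("failure surfaced to the scheduler for a rerun")
```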