Tech Lead, Data Engineering

Zip Co Limited
Remote

About The Position

The Tech Lead, Data Engineering is a senior technical owner responsible for the architecture, reliability, and evolution of Zip’s core data platforms powering financial operations, risk, analytics, and AI. This role owns technical strategy across one or more high-impact data domains and delivers outcomes through hands-on execution, architectural leadership, and operational rigor. You will design and operate mission-critical batch and real-time data systems with strong guarantees around correctness, timeliness, availability, security, and cost, while raising the technical bar across the data engineering organization.

Requirements

  • 10+ years building and operating large-scale, distributed data platforms, ideally in fintech or regulated environments.
  • Deep, hands-on expertise with:
      • Snowflake, Databricks, Airflow (Astronomer), Python, SQL
      • Azure cloud infrastructure
      • Kafka / Event Hub, batch and streaming architectures
      • Airbyte, CDC, and modern ingestion patterns
      • Open table formats (Iceberg, Delta)
      • Analytics and governance tooling (Atlan, Tableau, Power BI)
  • Proven ability to design systems with strong correctness, availability, and observability guarantees.
  • Strong experience with DevOps, CI/CD, Infrastructure-as-Code, and automation.
  • Demonstrated success leading complex, multi-year data platform initiatives and influencing senior technical and business stakeholders.

Responsibilities

  • Own and evolve Zip’s batch, streaming, and lakehouse platforms, powering trusted data for AI, risk, and financial systems.
  • Architect and deliver scalable, secure, and governed data products that unlock speed, accuracy, and insight across the business.
  • Own the technical direction and long-term architecture of Zip’s data platforms across Data Lake, Lakehouse, and Streaming systems.
  • Make and document architectural decisions across Snowflake, Databricks, Azure, and open table formats (Iceberg, Delta, Delta Live Tables), balancing scalability, governance, correctness, and unit economics.
  • Define reference architectures for ingestion, transformation, storage, and consumption, with clear contracts and ownership boundaries.
  • Establish standards for ETL/ELT, data modeling (Medallion, Data Vault, Kimball), schema evolution, data contracts, and access patterns using Airflow (Astronomer), Python, SQL, and cloud-native tooling.
  • Define and enforce transformation standards using dbt Core / dbt Cloud, including modular modeling, testing, documentation, and CI-integrated deployments.
  • Design and operate high-throughput batch and streaming pipelines using Kafka / Azure Event Hub, including CDC and event-driven architectures.
  • Own ingestion reliability and scalability using Airbyte and custom frameworks, ensuring data completeness and correctness.
  • Ensure platforms meet fintech-grade non-functional requirements, including security, privacy, compliance, resiliency, and cost controls.
  • Set the standard for production-grade data systems: deterministic, testable, fault-tolerant, observable, and recoverable.
  • Own engineering quality across design reviews, code standards, testing strategies, and data validation frameworks.
  • Lead investigation and resolution of complex, cross-platform incidents impacting financial reporting, risk, or customer-facing analytics.
  • Define and enforce SLAs/SLOs, incident response playbooks, and postmortem practices.
  • Drive Monitoring-as-Code and deep observability across pipelines, storage, and compute, including freshness, volume, and quality metrics.
  • Build trusted, well-modeled data foundations supporting financial reporting, risk modeling, BI, AI/ML, vector search, and NLQ use cases.
  • Ensure analytical and AI workloads operate safely and predictably without compromising data integrity or platform stability.
  • Apply data governance by design, including ownership, lineage, access controls, and data quality enforcement to meet regulatory and audit requirements.
  • Design and maintain a centralized semantic layer to ensure consistent metrics, trusted insights, and self-service analytics across BI, experimentation, and AI.
  • Partner with Risk, Legal, and Governance teams to ensure Responsible AI, auditability, and compliance are enforced by design.
  • Translate ambiguous, high-impact initiatives into clear technical strategies and delivery plans.
  • Drive a data product mindset, defining ownership, SLAs, contracts, and documentation for domain-aligned data products.
  • Anticipate scaling limits, failure modes, and cost inflection points, making explicit trade-offs grounded in strong economic thinking.
  • Proactively reduce technical debt and operational risk, prioritizing work that compounds platform leverage.
  • Delegate ownership while maintaining architectural coherence and high engineering standards.
  • Act as a technical multiplier, mentoring senior engineers and accelerating organizational maturity.
  • Lead technical hiring, onboarding, design forums, and knowledge-sharing practices.
  • Influence cross-functional partners across Product, Analytics, Risk, Finance, and Engineering through clear technical communication and data-driven decision-making.

Benefits

  • Flexible working culture
  • Incentive programs
  • Unlimited PTO
  • Generous paid parental leave
  • Leading family support policies
  • Company-sponsored 401k match
  • Learning and wellness subscription stipend
  • Beautiful Union Square office with a casual dress code
  • Industry-leading, employer-sponsored insurance for you and your dependents, with several 100% Zip-covered choices available

What This Job Offers

Job Type

Full-time

Career Level

Mid Level

Education Level

No Education Listed

Number of Employees

501-1,000 employees
