Senior Data Engineer

Lambda
San Francisco, CA
Hybrid

About The Position

Join our team as a Senior Data Engineer, where you will collaborate with a small, high-performing team of Analytics Engineers and Data Engineers to design data pipelines, a data warehouse, an automated data catalog, and internal tools. Your output will directly power product analytics, marketing, supply chain, investor relations, and more. You will have the opportunity to design and implement best practices while working directly with stakeholders. This position reports to the Data Platform Manager.

Requirements

  • A technical bachelor’s degree (computer science, information technology, analytics, or similar) that equips you to combine business intuition with technical skills.
  • 5+ years of experience in Data Engineer, Software Engineer, Analytics Engineer, or Senior Data Analyst roles.
  • Interest in being part of a small, highly versatile data team that does data engineering, data analysis, and data science at the executive, system, and project levels.
  • Willingness to roll up your sleeves and do architecture or data janitorial work that can have significant business impact.
  • Highly proficient in SQL and experience working with large-scale relational databases (e.g., PostgreSQL, MySQL, etc.).
  • Extensive experience with cloud data platforms (AWS, GCP, Azure) and related services such as S3, IAM, Redshift, BigQuery, etc.
  • Intermediate to advanced experience in Python or similar programming languages.
  • Experience with data orchestration tools like Apache Airflow, Prefect, or similar.
  • Knowledgeable in data modeling, database design, and performance optimization techniques.
  • A working understanding of data security, privacy, and compliance requirements.
  • Proven ability to work in a fast-paced, agile environment and manage multiple projects simultaneously.

Nice To Haves

  • Experience with data modeling in dbt (data build tool).
  • Knowledge of data visualization tools and BI platforms (e.g., Tableau, Looker, Power BI).
  • Experience with CI/CD build tools (GitLab CI/CD, GitHub Actions, Jenkins, or similar).
  • General cloud computing proficiency, such as knowledge of compute resources, databases, disk, networking, logging, alerting, etc., as they pertain to cloud providers such as AWS and GCP.
  • Experience with iPaaS tools (e.g., Workato, Zapier, Celigo, or similar).
  • Master’s degree in a technical, analytical, or business discipline that supplements a technical undergraduate degree.
  • Experience with Master Data Management (MDM) practices.
  • Experience with IaC tools (Terraform or similar).
  • Experience with containerization and orchestration tools (Docker, Kubernetes).
  • Familiarity with Bash and the ability to navigate and perform basic tasks within a Unix shell, such as a terminal.

Responsibilities

  • Design, build, and maintain scalable, robust, and efficient data pipelines (ELT) to process large volumes of structured and unstructured data from multiple sources.
  • Use your familiarity with areas such as GTM (Salesforce, Marketing, CRM), Finance and Product to support stakeholder requests and design internal tools that support business processes.
  • Work with product engineers and system admins to design, document, and build data flows and infrastructure. You will also be a technical subject matter expert and recommend patterns to fix data gaps and ensure data quality.
  • Design and manage data warehouses and data lakes, ensuring optimal storage, governance, cataloging, data lifecycle and retrieval performance.
  • Ensure the accuracy and consistency of data across multiple systems, performing regular data quality assessments. Implement testing, CI/CD, and IT General Controls to ensure the security and reliability of data flows.
  • Work closely with business leaders, analysts, and other stakeholders to understand data requirements and deliver appropriate solutions. This will include participating early in product-design discussions and staying involved with teams across the company.
  • Automate manual data processes and continuously optimize data workflows for performance, cost efficiency, and scalability.
  • Stay up-to-date with the latest industry trends, tools, and technologies to improve data engineering practices and infrastructure.
  • Provide technical leadership and mentorship to junior data engineers, analysts, and business users, sharing best practices and fostering a collaborative team environment.

Benefits

  • Generous cash & equity compensation
  • Health, dental, and vision coverage for you and your dependents
  • Wellness and commuter stipends for select roles
  • 401k Plan with 2% company match (USA employees)
  • Flexible paid time off plan that we all actually use