Senior Data Engineer - Full Stack

Codvo.ai · New York, NY
Hybrid

About The Position

We are seeking a highly skilled Senior Data Engineer – Full Stack to build and maintain internal tools, automation frameworks, and workflows that enhance the efficiency, reliability, and scalability of our data and machine learning platforms. This role will work closely with Data Engineers, Data Scientists, and ML Engineers to streamline operations across the data lifecycle.

Requirements

  • Strong experience in Python and scripting for automation and backend development
  • Hands-on experience with Databricks platform and ecosystem
  • Experience with APIs, Terraform, and/or Databricks SDK for automation
  • Solid understanding of ETL/ELT pipelines and data platform architecture
  • Experience building testing frameworks for data pipelines and ML workflows
  • Familiarity with CLI tool development and system automation
  • Knowledge of MLOps principles and practices
  • Experience with modern development practices, including:
      ◦ Spec-driven development
      ◦ Use of coding agents or automation-assisted development tools
      ◦ Version control and CI/CD pipelines
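To illustrate the kind of pipeline-testing experience listed above, here is a minimal sketch of a unit test for a data-cleaning step. The `clean_records` function and its record schema are hypothetical, purely for illustration, not code from Codvo.ai's platform:

```python
# Hypothetical ETL transform: drop rows without an 'id' and
# normalize the 'name' field. Illustrative only.
def clean_records(rows):
    return [
        {**row, "name": row["name"].strip().lower()}
        for row in rows
        if row.get("id") is not None
    ]


def test_clean_records():
    raw = [
        {"id": 1, "name": "  Alice "},
        {"id": None, "name": "Bob"},  # missing id: should be dropped
    ]
    assert clean_records(raw) == [{"id": 1, "name": "alice"}]


test_clean_records()
```

In practice, tests like this would live in a pytest suite and run in CI against representative fixtures before pipeline deployment.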

Nice To Haves

  • Experience building dashboards or internal tools using React, Streamlit, or similar frameworks
  • Familiarity with Databricks AI/BI or other data visualization tools
  • Exposure to data governance and metadata management frameworks
  • Experience working with cloud platforms (AWS preferred)

Responsibilities

  • Design and develop CLI tools, scripts, and internal utilities to automate repetitive tasks across the data platform, including:
      ◦ Pipeline execution and orchestration
      ◦ Data governance workflows
      ◦ Metadata synchronization
      ◦ Environment setup and configuration
      ◦ Test harness development
  • Automate workflows on Databricks, including:
      ◦ Job deployment and scheduling
      ◦ Environment provisioning
      ◦ MLOps processes using APIs, Terraform, or the Databricks SDK
  • Build and implement robust testing frameworks:
      ◦ Integration testing for pipelines
      ◦ End-to-end validation of ETL/ELT workflows
      ◦ Testing and validation for ML inference workflows
  • Improve overall productivity, scalability, and reliability of the data and ML engineering ecosystem
  • Develop lightweight internal tools and dashboards using frameworks such as React, Streamlit, or similar technologies to:
      ◦ Visualize data pipelines and workflows
      ◦ Demonstrate model inference capabilities
      ◦ Provide configuration and operational controls
      ◦ Enable internal productivity monitoring and dashboards
  • Collaborate with cross-functional teams to identify automation opportunities and implement best practices
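As a sketch of the CLI-tooling work described above, the following stub shows one common shape for an internal platform utility. The `dp` command names and subcommands are invented for illustration; a real tool would dispatch to the Databricks SDK or REST APIs rather than return placeholder strings:

```python
# Minimal sketch of an internal platform CLI (hypothetical commands).
import argparse


def build_parser():
    parser = argparse.ArgumentParser(
        prog="dp", description="Data-platform helper CLI (illustrative)"
    )
    sub = parser.add_subparsers(dest="command", required=True)

    run = sub.add_parser("run-pipeline", help="Trigger a pipeline run")
    run.add_argument("name", help="Pipeline name")
    run.add_argument("--env", default="dev", choices=["dev", "staging", "prod"])

    sub.add_parser("sync-metadata", help="Synchronize catalog metadata")
    return parser


def main(argv=None):
    args = build_parser().parse_args(argv)
    if args.command == "run-pipeline":
        # Placeholder: a real tool would submit a Databricks job here.
        return f"running {args.name} in {args.env}"
    return "metadata sync started"


if __name__ == "__main__":
    print(main())
```

Usage: `dp run-pipeline ingest --env staging`. Structuring internal utilities around subcommands like this keeps them scriptable and easy to extend as new automation tasks are identified.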