Data Engineer - Mid-Level

Varda Space Industries
El Segundo, CA
Onsite

About The Position

Varda is seeking a Data Engineer to join our team. In this role, you will support the development of the data pipelines, storage, security, and quality practices that power the organization, learning and contributing to our current application ecosystem and data architecture. You will collaborate with team members across functions to identify opportunities for process or application improvements, and you will help implement those solutions through the full product life cycle, from opportunity identification to support and sustainment.

You will also work cross-functionally with teams across Varda to support initiatives that enhance company processes, execution insight, and infrastructure robustness. Projects span a wide range of desired outcomes, including cycle time reduction, cost reduction, improved decision making, risk reduction, and other key operational efficiencies. Your contributions will have an immediate impact on functional operations and an opportunity to contribute to Varda's overall growth and success.

This role reports to the Director of Enterprise Applications. This is a full-time, onsite, exempt position located at our El Segundo headquarters.

Requirements

  • Bachelor's degree in Computer Science, Information Systems, or related fields.
  • 3+ years of experience in enterprise integration or data engineering roles (advanced degrees count towards years of experience).
  • Experience with cloud data warehouse or lakehouse platforms (e.g., Snowflake, Databricks).
  • Experience with relational (SQL) and non-relational (NoSQL) databases (e.g., PostgreSQL, MySQL, MongoDB, DynamoDB).
  • Exposure to data transformation tools (e.g., dbt).
  • Proficiency in SQL and at least one general-purpose programming language (e.g., Python).
  • Exposure to enterprise systems such as MRP, CRM, and ERP.
  • Foundational understanding of data modeling concepts (star schema, normalization, dimensional modeling).
  • Familiarity with source control (Git) and basic CI/CD concepts.
  • Ability to write clean, documented code and a willingness to learn data quality testing practices.
  • Understanding of software architecture principles, data warehousing, and ELT workflows.
  • Ability to gather business requirements from stakeholders and take a project from initial concept to finished product.
  • Ability to work with end users to rapidly iterate on prototype applications by solving end-user issues.
  • Ability to translate technical concepts to non-technical audiences for stakeholders, internal and external to the organization.

Nice To Haves

  • Experience with database administration, including schema design, performance tuning, and query optimization.
  • Experience interfacing with engineering and manufacturing groups to understand system designs and translate them into data needs.
  • Experience supporting AI/ML pipelines, including feature stores, training data pipelines, or model monitoring infrastructure.
  • Experience in a start-up or similar high-growth environment.
  • Familiarity with workflow orchestration tools (e.g., Airflow).
  • Exposure to aerospace, defense, or pharmaceutical data environments.
  • Experience working with ERP systems deployed in a design development environment.
  • Exposure to Infrastructure as Code for cloud resource provisioning (e.g., Terraform).
  • Relevant platform or tooling certifications (e.g., dbt Certified Developer, cloud data platform certifications).

Responsibilities

  • Assist in building and maintaining ELT pipelines that ingest data from ERP, CRM, PLM, QMS, and other enterprise systems.
  • Support enterprise data platform and enterprise application evaluations by researching external solutions and documenting findings.
  • Assist with data storage tasks, including table format standards, partitioning strategies, and interoperability across the tool ecosystem.
  • Establish data quality frameworks, lineage tracking, and pipeline observability, including SLA monitoring, proactive alerting, and production-grade logging.
  • Support AI/ML workflows by designing feature pipelines, clean data products, and a governed knowledge layer that keeps data semantically aligned for reliable model and agent consumption.
  • Work with application engineers and functional stakeholders to help translate requirements and business logic into technical specifications and data solutions.
  • Operate effectively within regulated environments where your pipelines will be subject to compliance and auditability requirements enforced by FDA GMP, ITAR, and DCAA.
  • Implement and enforce modern development practices, including source control, peer review, CI/CD pipelines, and automated build/test/deploy workflows (e.g., GitHub Actions).
  • Leverage AI-assisted development tooling to accelerate delivery and improve personal productivity.
  • Learn and apply layered data architecture patterns to organize data assets for reliability and reuse.

Benefits

  • Equity in a fully funded space startup with potential for significant growth (interns excluded)
  • 401(k) matching (interns excluded)
  • Unlimited PTO (interns excluded)
  • Health insurance, including vision and dental
  • Lunch and snacks provided on site every day
  • Dinners provided twice a week
  • Maternity / Paternity leave (interns excluded)