Data Engineer I, Data Products

Crane Worldwide Logistics · Houston, TX

About The Position

Essential Job Functions

Data Pipeline & Platform Development
  • Build and maintain pipelines using dbt, Prefect, and Terraform
  • Develop and manage connectors across sources and targets, including Kafka, RDBMSs, and Snowflake
  • Implement schema evolution, validation rules, and automated testing
  • Support high-availability and disaster-recovery design for Snowflake and Materialize

Data Product Engineering
  • Author and review schemas and data contracts for consistency and governance
  • Develop and optimize dbt models for Snowflake and Materialize analytics layers
  • Configure clusters and role-based access for shared environments
  • Document datasets to ensure discoverability and proper usage across teams

Stakeholder Collaboration
  • Partner with BI developers, analysts, and business teams to deliver datasets that support reporting, dashboards, and integrations
  • Investigate and resolve data issues, ensuring durable fixes
  • Participate in design reviews to align technical solutions with business requirements

Collaboration & Standards
  • Contribute to PR and design reviews for pipelines and models
  • Support platform governance, observability, and best practices for data quality
  • Work with adjacent teams (Ops & Reliability, Analytics, Product) to align on SLAs and data definitions
  • Other duties as assigned

Requirements

  • Proficiency in Python and SQL for building and optimizing data pipelines
  • Hands-on experience with dbt for modeling and testing, and Terraform for infrastructure-as-code
  • Familiarity with modern data platforms: Snowflake, Materialize, Kafka, HVR, Fivetran, or Stitch
  • Understanding of data contracts, observability, and governance practices
  • Experience with CI/CD tools (GitHub Actions, GitLab CI, or similar)
  • Ability to translate business needs into scalable technical solutions
  • Bachelor's degree in Math, Computer Science, Information Technology, or a related field preferred but not required
  • 1-3 years of prior experience in a data engineering or data-heavy backend software engineering role

Nice To Haves

  • Knowledge of compliance frameworks (e.g., GDPR, CCPA, SOC 2)

Benefits

  • Quarterly Incentive Plan
  • 136 hours of Paid Time Off (17 days per year), which can be used for sick time or personal use
  • Excellent Medical, Dental and Vision benefits
  • Tuition Reimbursement for education related to your job
  • Employee Referral Bonuses
  • Employee Recognition and Rewards Program
  • Paid Volunteer Time to support a cause that is close to your heart and contributes to our communities
  • Employee Discounts
  • Wellness Incentives of up to $100 per year for completing challenges, plus a discount on contribution rates