Data Engineer

BetterX
Charleston, SC

About The Position

About BetterX

BetterX is the technology subsidiary of Better Collision Centers, created to bring modern, practical, AI-powered solutions to industries that have historically been underserved by technology. We build systems that simplify complex workflows, remove manual bottlenecks, and deliver clean, reliable data that teams can trust. Our tools support real-world operations in collision repair and other service-based businesses, with a strong focus on usability, performance, and reliability. We are a small, fast-moving team that values ownership, strong engineering fundamentals, and shipping production-ready systems that make a real impact.

About the Role

We are seeking a Data Engineer to design, build, and deliver a secure, scalable data pipeline connecting external webhooks, cloud data warehouses, and CRM systems. This is a temporary, project-based role with an expected duration of three to six months, with a strong possibility of extension or conversion to a full-time position based on performance and business needs. You will work closely with API, platform, AI, and analytics teams to ensure downstream systems receive clean, well-structured, and trustworthy data.

Requirements

  • 2–3+ years of professional experience with Python and SQL
  • Hands-on experience building modern data pipelines (batch and/or streaming)
  • Experience working with semi-structured data, including XML and JSON
  • Familiarity with cloud platforms such as AWS, GCP, or Azure
  • Experience with data warehouse platforms such as Snowflake
  • Strong understanding of data modeling and layered data architectures
  • Experience implementing data quality checks, retry logic, and reconciliation processes
  • Ability to work independently and deliver production-ready systems within a defined timeline
  • Strong communication skills and comfort working cross-functionally

Nice To Haves

  • Experience with collision repair, insurance, or automotive data
  • CRM data integration or API-based data delivery
  • Multi-tenant cloud architecture design
  • Exposure to data observability tools (Great Expectations, Datadog, Monte Carlo)
  • Familiarity with analytics tools such as Tableau or Power BI

Responsibilities

  • Design and implement event-driven, incremental data ingestion pipelines using webhooks and cloud data warehouse tools
  • Ingest and process high-volume XML and JSON data using idempotent, retry-safe logic
  • Build and maintain raw, parsed, and curated data layers in Snowflake or similar cloud warehouses
  • Implement data validation, reconciliation checks, and error-handling for critical pipelines
  • Monitor pipeline health, including latency, throughput, and error rates
  • Design and enforce secure, multi-tenant data isolation and role-based access control
  • Partner with API, AI, analytics, and business stakeholders to support CRM integrations and data delivery
  • Document pipeline architecture, schemas, data flows, and operational runbooks