Truckstop · Posted 8 days ago
Full-time • Mid Level
Boise, ID
1,001-5,000 employees

At Truckstop, we have transformed the entire freight-moving lifecycle with our SaaS solutions. From freight matching to payments and everything in between, we are the trusted partner for carriers, brokers, and shippers alike. We lead this industry forward with our One Team mindset, committing to principles such as assume positive intent, have each other’s back, and be your authentic self. Our drive for greatness produces high expectations, yet our regard for humans is even higher. Join a team of brilliant minds and generous hearts who care deeply about others’ success.

Data Engineer III — Truckstop.com

Truckstop.com is seeking a seasoned Data Engineer III to help strengthen and scale our modern data platform. In this role, you’ll build and optimize the pipelines, models, and infrastructure that power our analytics, product intelligence, and customer-facing solutions. We’re looking for someone who thrives in a fast-moving environment, collaborates well across teams, and embraces AI-driven development to move faster and smarter.

Responsibilities:

  • Design, build, and maintain scalable ELT pipelines and data models with Snowflake, dbt, and SQL.
  • Develop data infrastructure and platform components using Terraform, Python, and modern orchestration tools.
  • Work closely with engineering, analytics, and product teams to ensure data quality, reliability, and availability.
  • Optimize ingestion, transformation, and storage patterns across Postgres and other relational systems.
  • Partner with BI and analytics teams to enable self-service reporting in Domo (or other BI tools such as Metabase, Tableau, or Power BI).
  • Manage and enhance data integration workflows using Airbyte and Matillion.
  • Drive architectural improvements around data governance, observability, automation, and scaling.
  • Leverage AI-powered coding and workflow tools (e.g., GitHub Copilot, Cursor, CodeWhisperer, OpenAI Assistants) to accelerate delivery and improve code quality.
Qualifications:

  • Strong experience with Snowflake, SQL, and dbt in a production environment.
  • Solid understanding of Terraform and infrastructure-as-code practices.
  • Proficiency in Python for data processing, scripting, and automation.
  • Experience implementing and maintaining ELT pipelines and data integrations.
  • Familiarity with Postgres or other relational databases.
  • Hands-on experience with BI or analytics tools.
  • Excellent communication skills and the ability to work cross-functionally.
  • Experience with Airbyte, Matillion, or similar ETL/ELT platforms is highly valued.
  • Demonstrated use of AI tools (e.g., Copilot, Cursor, Codex, Claude Code) in day-to-day engineering work.
  • Bonus: Background in supply chain, freight, or logistics.