Data Engineer (55225)

Premium Guard Incorporated
Lockbourne, OH
Hybrid

About The Position

Premium Guard Inc. was founded in 1996 as a pioneer in the global manufacture of aftermarket automotive products. From the very beginning, our focus has been to deliver a complete, turnkey solution with industry-leading application coverage and quality private label programs for all segments of the automotive aftermarket in North America.

Premium Guard Inc. is excited to announce that we're hiring a Data Engineer for either our location in Columbus, Ohio or Memphis, TN. This is a hybrid role with a minimum of 2 days on site per week. The Data Engineer will be responsible for building reliable, scalable data pipelines and platforms that support analytics, reporting, and future advanced initiatives. This role focuses primarily on data engineering fundamentals (data ingestion, transformation, storage, quality, and performance), with limited initial responsibility for AI/ML and a clear growth path into advanced analytics and intelligent systems.

Requirements

  • 5+ years of experience in data engineering or related roles
  • Strong proficiency in Python and SQL
  • Hands-on experience with Databricks, SQL Server, and cloud platforms (AWS and/or Azure)
  • Deep understanding of data modeling, ETL/ELT pipelines, and orchestration concepts
  • Familiarity with CI/CD pipelines and version control tools (e.g., GitHub)
  • Excellent problem-solving, communication, and collaboration skills

Nice To Haves

  • Experience in automotive, manufacturing, or supply chain environments
  • Familiarity with dbt, Airflow, Azure Data Factory
  • Exposure to infrastructure-as-code tools (Terraform, CDK)
  • Knowledge of data cataloging, lineage, and monitoring platforms
  • Introductory exposure to AI/ML concepts and workflows

Responsibilities

  • Design, build, and maintain robust data pipelines supporting new products, customers, and data sources
  • Develop scalable data models and ETL/ELT processes using Databricks, SQL Server, and Python
  • Integrate data from internal systems (ERP, logistics, sales, operations) and external partners
  • Optimize data workflows for performance, reliability, scalability, and cost efficiency across AWS and Azure
  • Implement best practices for data quality, observability, governance, and documentation
  • Support data-driven decision-making across operations, supply chain, product, and leadership teams
  • Monitor pipeline health and resolve failures proactively
  • Apply change management and release controls
  • Maintain system documentation and technical standards
  • Collaborate with BI analysts, developers, and business stakeholders
  • Prepare and curate datasets for analytics and future ML initiatives
  • Assist with feature engineering and data preprocessing efforts
  • Participate in pilot projects involving predictive analytics and automation as the organization evolves