Data Engineer (AI-RPA)

PADNOS | Grandville, MI
Onsite

About The Position

PADNOS is seeking a Data Engineer on our Data and Software team who thrives at the intersection of data, automation, and applied AI. This role builds intelligent data pipelines and robotic process automations (RPAs) that connect systems, streamline operations, and unlock efficiency across the enterprise. You'll design and develop pipelines using Python, SQL Server, and modern APIs, integrating services such as OpenAI, Anthropic, and Azure ML to drive automation and accelerate business processes. Your work will extend beyond traditional data engineering, applying AI models and API logic to eliminate manual effort and make data more actionable across teams.

You will report directly to the IT Manager at PADNOS Corporate in Grandville, MI. This is an in-person role; candidates must reside within daily commuting distance of Grandville, Michigan. We do not relocate, sponsor visas, or consider remote applicants.
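
To make that pipeline work concrete, the minimal sketch below shows one step of the kind described: read unprocessed rows from SQL Server, enrich them through an AI API, and write the results back. The table, columns, connection string, model name, and prompt are illustrative assumptions for this posting, not PADNOS systems.

    # Minimal sketch (assumptions: a hypothetical scrap_tickets table, a local
    # SQL Server instance, and the openai + pyodbc packages; not PADNOS code).
    import os

    import pyodbc
    from openai import OpenAI

    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};SERVER=localhost;"
        "DATABASE=Operations;Trusted_Connection=yes;TrustServerCertificate=yes;"
    )

    def classify(description: str) -> str:
        """Ask an LLM to normalize a free-text material description."""
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{
                "role": "user",
                "content": f"Return a one-word material category for: {description}",
            }],
        )
        return resp.choices[0].message.content.strip()

    # Pull rows that still need enrichment, classify each, and write the result back.
    cur = conn.cursor()
    cur.execute("SELECT ticket_id, description FROM scrap_tickets WHERE category IS NULL")
    for ticket_id, description in cur.fetchall():
        cur.execute(
            "UPDATE scrap_tickets SET category = ? WHERE ticket_id = ?",
            classify(description), ticket_id,
        )
    conn.commit()

A production version would batch the API calls and add retries and logging, but the flow of SQL in, AI enrichment, SQL out is the core pattern of the role.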

Requirements

  • Bachelor's degree or equivalent experience in data engineering, computer science, or software development.
  • Must have personally owned an automated pipeline end-to-end (design → build → deploy → maintain).
  • Minimum 3 years hands-on experience building production data pipelines using Python and SQL Server. Contract, academic, bootcamp, or coursework experience does not qualify.
  • Intermediate to advanced Python development skills, particularly for data and API automation.
  • Experience working with RESTful APIs and JSON data structures.
  • Familiarity with AI/ML API services (OpenAI, Anthropic, Azure ML, etc.) and their integration into data workflows.
  • Knowledge of SQL Server performance tuning and query optimization.
  • Familiarity with Git and CI/CD workflows for data pipeline deployment.

Nice To Haves

  • Experience with modern data stack components such as Fivetran, dbt, or similar tools.
  • Experience deploying or maintaining RPA or AI automation solutions.

Responsibilities

  • Design and develop automated data pipelines that integrate AI and machine learning services to process, enrich, and deliver high-value data for analytics and automation use cases.
  • Build, maintain, and optimize SQL Server ELT workflows and Python-based automation scripts.
  • Connect to external APIs (OpenAI, Anthropic, Azure ML, and other SaaS systems) to retrieve, transform, and post data as part of end-to-end workflows (see the sketch after this list).
  • Partner with business stakeholders to identify manual workflows and translate them into AI-enabled automations.
  • Work with software developers to integrate automation logic directly into enterprise applications.
  • Implement and monitor data quality, reliability, and observability metrics across pipelines.
  • Apply performance tuning and best practices for database and process efficiency.
  • Develop and maintain reusable Python modules and configuration standards for automation scripts.
  • Support data governance and version control processes to ensure consistency and transparency across environments.
  • Collaborate closely with analytics, software, and operations teams to prioritize and deliver automation solutions that create measurable business impact.
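
The API-facing responsibilities above follow a common retrieve-transform-post shape. The sketch below illustrates that shape with the requests library against hypothetical endpoints and field names; the real systems, authentication, and schemas would come from the actual integrations.

    # Minimal retrieve-transform-post sketch (hypothetical endpoints and fields;
    # assumes the requests package is available).
    import requests

    SOURCE_URL = "https://api.example.com/v1/orders"        # hypothetical source API
    TARGET_URL = "https://automation.example.com/v1/jobs"   # hypothetical target API

    def run_sync(api_token: str) -> int:
        headers = {"Authorization": f"Bearer {api_token}"}

        # Retrieve: pull raw JSON records from the source system.
        resp = requests.get(SOURCE_URL, headers=headers, timeout=30)
        resp.raise_for_status()
        orders = resp.json()["items"]

        # Transform: keep only the fields the downstream workflow needs.
        payload = [
            {"order_id": o["id"], "status": o["status"], "weight_lbs": o["net_weight"]}
            for o in orders
            if o.get("status") == "ready"
        ]

        # Post: hand the transformed batch to the target workflow.
        out = requests.post(TARGET_URL, json=payload, headers=headers, timeout=30)
        out.raise_for_status()
        return len(payload)

In practice each integration adds its own auth, pagination, and retry handling, but the retrieve-transform-post skeleton stays the same.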


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Industry: Primary Metal Manufacturing
  • Number of Employees: 501-1,000
