Foundry Data Integrations Engineer II

Lear Corporation, Southfield, MI

About The Position

As a member of the IT Central Foundry Center of Excellence (COE), the Foundry Data Integrations Engineer performs highly skilled technical work focused on Foundry data ingestion. This role is accountable for designing, building, and supporting enterprise data integrations and ingestion pipelines within Palantir Foundry to ensure reliable, scalable, and production-ready data across Lear's global manufacturing, supply chain, and administrative environments.

Requirements

  • Bachelor's degree in Computer Science, Engineering, Data Science, or a related field
  • 1+ year of experience working with Palantir Foundry
  • 2+ years of experience in data engineering or ETL roles with heavy SQL usage
  • Strong proficiency in SQL including complex joins, aggregations, and performance tuning
  • Hands-on experience integrating Excel files, PDFs, APIs, and relational databases
  • Experience working with DB2, SQL Server, PostgreSQL, MySQL, Progress, Oracle, and/or MariaDB
  • Experience working with REST APIs and structured JSON data
  • Familiarity with webhooks and event-driven integration patterns
  • Experience with Python and Spark SQL
  • Familiarity with Git-based workflows, agile development practices, and production support models
  • Strong analytical, problem-solving, and communication skills

Responsibilities

  • Designing, building, and maintaining Palantir Foundry ingestion pipelines, data connections, and source-to-prepared datasets
  • Developing, optimizing, and maintaining SQL statements for extraction, transformation, aggregation, validation, and performance-efficient ingestion
  • Integrating data from file-based sources including Excel spreadsheets, CSV files, and PDF documents
  • Integrating relational databases including DB2, SQL Server, PostgreSQL, MySQL, Progress, Oracle, and MariaDB
  • Building and supporting API-based integrations using REST and JSON
  • Supporting event-driven integrations including webhooks and trigger-based ingestion patterns
  • Monitoring ingestion health, pipeline execution, data freshness, and schema changes
  • Troubleshooting ingestion failures, connectivity issues, and data quality problems
  • Documenting integration logic, data flows, and operational procedures
  • Partnering with engineering, finance, operations, and IT teams to onboard new data sources and support digital transformation initiatives