Platform Engineer

Ford Motor Company
Dearborn, MI
Hybrid

About The Position

At Ford Motor Company, we believe freedom of movement drives human progress. We also believe in providing you with the freedom to define and realize your dreams. With our incredible plans for the future of mobility, we have a wide variety of opportunities for you to accelerate your career potential as you help us define tomorrow’s transportation. Enterprise Technology plays a critical part in shaping the future of mobility. If you’re looking for the chance to leverage advanced technology to redefine the transportation landscape, enhance the customer experience and improve people’s lives, this is the opportunity for you. Join us and challenge your IT expertise and analytical skills to help create vehicles that are as smart as you are.

Requirements

  • Bachelor’s degree or foreign equivalent in Computer Science, Information Technology or a related field and 5 years of progressive, post-baccalaureate experience in the job offered or a related occupation.
  • 2 years of experience utilizing Google Cloud Platform (GCP) services, including BigQuery, Dataflow, Pub/Sub, Compute Engine, Cloud Storage, Cloud SQL, and Cloud Functions, to design and implement enterprise-scale cloud infrastructure using Python, Terraform, or YAML configuration files.
  • 2 years of experience utilizing Terraform for Infrastructure as Code (IaC), including experience with HCL syntax, state management, modules, and automated provisioning of GCP resources using shell scripts, Bash automation, or YAML configuration templates for consistent deployments.
  • 2 years of experience utilizing Tekton pipelines and Cloud Build for CI/CD automation, including experience with container image builds, Docker, Kubernetes, YAML pipeline definitions, Python automation scripts, and integration with Git repositories for continuous integration workflows.
  • 2 years of experience utilizing Cloud Scheduler for automated resource management, including experience with cron expressions, REST APIs, Python scheduling scripts, and integration with Cloud Functions, Pub/Sub, and database instances for cost optimization and operational automation.
  • 2 years of experience utilizing Cloud Composer (Apache Airflow) and Dataflow for data pipeline orchestration, including experience with Python DAGs, Apache Beam, workflow scheduling, and shell script integration for automated ETL/ELT processing and data transformation workflows.
  • 2 years of experience utilizing GCP Identity and Access Management (IAM) for security implementation, including experience with role-based access control, service accounts, Terraform IAM configurations, Python automation scripts, and JSON policy definitions for enterprise security compliance.
  • 2 years of experience utilizing BigQuery, Dataproc, and Bigtable for data analytics platforms, including experience with SQL queries, Python data processing scripts, Apache Spark, Hadoop clusters, and automated data ingestion using shell scripts and YAML configurations.
  • 2 years of experience utilizing data governance and compliance tools, including GCP Data Catalog and Informatica EDC, to implement automated data lineage tracking using Python scripts, metadata management, Terraform resource provisioning, and regulatory compliance frameworks with YAML configuration management.

Responsibilities

  • Analyze and manipulate large datasets supporting the enterprise by activating data assets to support Enabling Platforms and Analytics in the Google Cloud Platform (GCP).
  • Design transformation and modernization solutions on GCP, including landing data from source applications into GCP.
  • Design the right solutions, using an appropriate combination of GCP and third-party technologies, for deployment on Google Cloud Platform.
  • Work in a collaborative environment, including pairing and mobbing with other cross-functional engineers.
  • Work on a small agile team to deliver working, tested software.
  • Work effectively with fellow data engineers, product owners, data champions and other technical experts.
  • Develop exceptional analytics data products using streaming and batch ingestion patterns in the Google Cloud Platform, applying solid data warehouse principles.
  • Design and deploy a pipeline with automated data lineage.
  • Identify, develop, evaluate and summarize Proof of Concepts to prove solutions.
  • Test and compare competing solutions and report out a point of view on the best solution.
  • Build and maintain the integration between GCP Data Catalog and Informatica EDC.
  • Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services.

Benefits

  • Immediate medical, dental, and prescription drug coverage
  • Flexible family care, parental leave, new parent ramp-up programs, subsidized back-up child care and more
  • Vehicle discount program for employees and family members, and management leases
  • Tuition assistance
  • Established and active employee resource groups
  • Paid time off for individual and team community service
  • A generous schedule of paid holidays, including the week between Christmas and New Year's Day
  • Paid time off and the option to purchase additional vacation time