Data Engineer

Ford
Dearborn, MI
$140,629 - $184,135 (Hybrid)

About The Position

At Ford Motor Company, we believe freedom of movement drives human progress. We also believe in providing you with the freedom to define and realize your dreams. With our incredible plans for the future of mobility, we have a wide variety of opportunities for you to accelerate your career potential as you help us define tomorrow’s transportation.

The people of Ford Motor Credit Company have a 60-year commitment to helping put people behind the wheels of great Ford and Lincoln vehicles. By partnering with dealerships, we provide financing, personalized service, and professional expertise to five thousand dealers and more than four million customers in over 100 countries around the world. If you’re customer-focused, driven, and seeking the opportunity to experience exciting challenges and growth, look no further.

What you'll be able to do: Data Engineer positions offered by Ford Motor Credit Company LLC (Dearborn, Michigan). Note that this is a hybrid position: the employee will work both from home and from the anticipated worksite, and must therefore live within a reasonable commuting distance of that worksite. The day-to-day duties of the role are listed under Responsibilities below.

Requirements

  • Bachelor’s degree or foreign equivalent in Computer Engineering or a related field and 5 years of progressive, post-baccalaureate experience in the job offered or a related occupation.
  • 5 years of experience in each of the following skills is required: 1. Advanced SQL Development using Teradata, GCP BigQuery, and other relational database management systems including SQL Server, Postgres, or Oracle for building and optimizing complex queries and ETL processes. 2. Analytics and Data Product Development with a strong focus on designing, developing, and maintaining data warehouse solutions tailored for analytics, reporting, and data product consumption.
  • 3 years of experience in each of the following skills is required: 1. Designing and implementing production-grade solutions on cloud platforms including Google Cloud Platform. 2. Utilizing GCP native services including BigQuery and Google Cloud Storage for building scalable data infrastructure. 3. Developing and managing workflow orchestration with Apache Airflow for scheduling, monitoring, and ensuring reliability of complex data pipelines (a brief illustrative sketch follows this list). 4. Utilizing GitHub for source code management. 5. Implementing Infrastructure as Code with Terraform to provision, configure, and maintain cloud infrastructure in a repeatable, automated manner, and developing and maintaining CI/CD pipelines with Tekton for continuous integration and multi-cloud continuous delivery.
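For illustration only (not part of the stated requirements): the sketch below shows the kind of Airflow orchestration referenced in item 3 above, assuming a hypothetical daily batch load into BigQuery. The DAG name, schedule, project, dataset, and SQL are placeholders, not actual Ford Credit pipelines.

```python
# Hypothetical Airflow DAG (Airflow 2.4+): one daily batch load into BigQuery.
# Project, dataset, and table names are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,                          # retry transient failures
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_contract_load",          # placeholder pipeline name
    schedule="0 6 * * *",                  # run once a day at 06:00 UTC
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args=default_args,
) as dag:
    # Append yesterday's records from a staging table into a reporting table;
    # both tables are illustrative.
    load_to_reporting = BigQueryInsertJobOperator(
        task_id="load_to_reporting",
        configuration={
            "query": {
                "query": """
                    INSERT INTO `my-project.reporting.contracts`
                    SELECT *
                    FROM `my-project.staging.contracts`
                    WHERE load_date = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
                """,
                "useLegacySql": False,
            }
        },
    )
```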

Responsibilities

  • Design and build production data engineering solutions to deliver reusable patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Dataform, Astronomer, Data Fusion, Dataproc, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Artifact Registry, GCP APIs, Cloud Build, and App Engine, as well as real-time data streaming platforms like Apache Kafka, GCP Pub/Sub, and Qlik Replicate.
  • Collaborate with stakeholders and cross-functional teams to gather and define data requirements, ensuring alignment with business objectives.
  • Design and implement scalable, fault-tolerant batch and real-time streaming solutions for data ingestion, processing, and storage, built for enterprise use with cloud-native tools (a brief illustrative streaming sketch follows this list).
  • Perform necessary data mapping, impact analysis for changes, root cause analysis, data lineage activities, and document information flows.
  • Develop and maintain documentation for data engineering processes, standards, and best practices, ensuring knowledge transfer and ease of system maintenance.
  • Utilize GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures.
  • Provide production support, addressing issues in line with SLAs.
  • Implement an enterprise data governance model and actively promote the concepts of data protection, sharing, reuse, quality, and standards to ensure the integrity and confidentiality of data.
  • Work in an agile product team to deliver code frequently using Test Driven Development (TDD) and continuous integration/continuous deployment (CI/CD), utilizing Tekton and Jenkins.
  • Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure.
  • Continuously enhance FMCC domain knowledge, stay current on the latest data engineering practices, and contribute to the company's technical direction while maintaining a customer-centric approach.
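For illustration only: a minimal sketch of the kind of real-time ingestion described above, assuming a hypothetical Pub/Sub subscription whose JSON messages are streamed into a BigQuery table with the streaming-insert API. Project, subscription, and table IDs are placeholders, and a production pipeline would typically run on Dataflow or another managed runner rather than a single subscriber process.

```python
# Hypothetical streaming ingestion: Pub/Sub -> BigQuery streaming inserts.
# Project, subscription, and table IDs are placeholders, not real Ford Credit resources.
import json

from google.cloud import bigquery, pubsub_v1

PROJECT_ID = "my-project"                    # placeholder
SUBSCRIPTION_ID = "vehicle-events-sub"       # placeholder
TABLE_ID = "my-project.raw.vehicle_events"   # placeholder

bq_client = bigquery.Client(project=PROJECT_ID)
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)


def handle_message(message: pubsub_v1.subscriber.message.Message) -> None:
    """Parse one Pub/Sub message and stream it into BigQuery."""
    row = json.loads(message.data.decode("utf-8"))
    errors = bq_client.insert_rows_json(TABLE_ID, [row])  # streaming insert
    if errors:
        message.nack()   # redeliver; a real pipeline would alert or dead-letter
    else:
        message.ack()


if __name__ == "__main__":
    streaming_pull = subscriber.subscribe(subscription_path, callback=handle_message)
    print(f"Listening on {subscription_path} ...")
    try:
        streaming_pull.result()      # block until cancelled or an error occurs
    except KeyboardInterrupt:
        streaming_pull.cancel()
```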

Benefits

  • Immediate medical, dental, and prescription drug coverage
  • Flexible family care, parental leave, new parent ramp-up programs, subsidized back-up child care and more
  • Vehicle discount program for employees and family members, and management leases
  • Tuition assistance
  • Established and active employee resource groups
  • Paid time off for individual and team community service
  • A generous schedule of paid holidays, including the week between Christmas and New Year's Day
  • Paid time off and the option to purchase additional vacation time.