Manager, Data Engineering

Ford Motor Company, Dearborn, MI
Hybrid

About The Position

Ford made history and now works to transform the future for its customers, communities, and families, with employees seeing their work on the road daily. The Global Data Insight & Analytics team believes data tells the real story, advising leadership on business conditions, customer needs, and the competitive landscape to drive evidence-based decision-making.

As a Design Cost & Complexity (DCC) Analytics Data Engineering Manager, you will lead a team responsible for building and maintaining the data pipelines that support DCC Analytics. Your team will design, develop, and maintain foundational data assets and services for Artificial Intelligence, Data Science, and Software Engineering. You will also contribute to Ford’s Data Hub strategy, focusing on domain-focused warehouses as a single source of truth for the enterprise, and champion data engineering standardization across PLMA datasets.

This role offers an experienced data engineering professional the opportunity to guide the team in designing effective data curation solutions, prioritizing tasks, making timely decisions, and ensuring high-quality results. Expertise in data governance, customer consent, and security standards is crucial for responsible and ethical data operations.

Requirements

  • Bachelor’s degree in Computer Science, Information Technology, Information Systems, Data Analytics, or a related field.
  • 8+ years of experience in complex data environments, demonstrating increased responsibilities and achievements with:
      ◦ Expertise in programming languages such as Python or Scala, and strong SQL skills.
      ◦ Experience with ETL/ELT processes, data warehousing, and data modeling.
      ◦ Experience with CI/CD pipelines, Docker, and Git/Gerrit, including designing resilient deployment strategies and sophisticated release management.
      ◦ Familiarity with data governance, privacy, quality, and monitoring.
      ◦ Proven experience in implementing sophisticated testing strategies, driving quality tool adoption, establishing comprehensive code review processes, and setting observability standards with advanced monitoring and proactive alerting.
  • 5+ years of experience within the automotive industry or related product development environments and product lifecycle management.
  • 5+ years of experience in leading software or data engineering teams, with a focus on team development and project success.
  • 5+ years of experience in Big Data environments or expertise with Big Data tools, including:
      ◦ Data processing frameworks and data modeling.
      ◦ In-depth knowledge and practical experience with Google Cloud Platform services.
      ◦ Proven experience in monitoring and optimizing costs and compute resources in hyperscaler platforms.
      ◦ Significant experience leveraging Generative AI and LLMs to optimize data engineering workflows (e.g., automated code generation, documentation, or metadata management).

Nice To Haves

  • Master's degree in Computer Science, Engineering, or a related field.
  • Expertise in GCP-based data engineering services such as BigQuery, Dataflow, Cloud Composer (Airflow), Dataform, Datastream, Apache Beam, Cloud Run, and Cloud Functions.
  • Familiarity with automotive Product Development processes, including program planning, design validation, and cross-functional collaboration across engineering, manufacturing, and supplier teams to deliver data-driven insights at each lifecycle stage.
  • Experience in managing and scaling serverless applications and clusters, focusing on resource optimization and robust monitoring and logging strategies.
  • Proficiency in unstructured data ingestion, including experience with data modeling and preparation techniques to support AI and machine learning workloads.
  • Experience with AI architecture and AI-enabling technologies (e.g., graph databases, vector databases).
  • Familiarity with data visualization tools (e.g., Power BI, Tableau).
  • Working knowledge of ontology, semantic modeling, and related technologies.

Responsibilities

  • Lead, mentor, and develop a high-performing team of local and remote Portfolio Data Engineers, fostering a culture of collaboration, innovation, and continuous improvement.
  • Strategically prioritize and manage team workloads, ensuring effective task allocation and resource capacity to support team goals.
  • Provide expert technical guidance and mentorship, ensuring adherence to best practices, coding standards, and architectural guidelines.
  • Act as the Chief Data Technical Anchor for the PLMA domain, resolving critical incidents through Root Cause Analysis (RCA) and implementing permanent, resilient architectural fixes.
  • Oversee the design, development, maintenance, scalability, reliability, and performance of data platform pipelines, aligning them with business needs and strategic objectives.
  • Contribute to the long-term strategic direction of the Data Platform by proactively identifying opportunities for best practice adoption and standardization.
  • Champion data quality, governance, and security standards, ensuring compliance and safeguarding sensitive data assets.
  • Enhance efficiency and reduce redundancy by consolidating common tasks across teams.
  • Effectively communicate decisions to stakeholders, building strong relationships and ensuring alignment on data initiatives.
  • Maintain awareness of industry trends and emerging technologies to inform technical decisions.
  • Lead the implementation of customer requests into data assets, ensuring optimized design and code development.
  • Guide the team in delivering scalable, robust data solutions and contribute hands-on to critical projects, including design and code reviews.
  • Lead technical decisions that drive data innovation and resilience.
  • Demonstrate full stack cloud data engineering expertise, covering automation, versioning, ingestion, integration, transformation, optimization, and data modeling.
  • Engage in agile planning, including scoping, work breakdown structure, and roadblock resolution.
  • Design solutions for cost and consumption optimization, scalability, and performance.
  • Collaborate with Data Architecture and stakeholders on solution design, data consolidation, retention, purpose of use, compliance, and audit requirements.
  • Drive engineering excellence by establishing and monitoring SWE-centric quality metrics (including DORA metrics and P99 latency targets).

Benefits

  • Immediate medical, dental, vision and prescription drug coverage
  • Flexible family care days, paid parental leave, new parent ramp-up programs, subsidized back-up child care and more
  • Family building benefits including adoption and surrogacy expense reimbursement, fertility treatments, and more
  • Vehicle discount program for employees and family members and management leases
  • Tuition assistance
  • Established and active employee resource groups
  • Paid time off for individual and team community service
  • A generous schedule of paid holidays, including the week between Christmas and New Year’s Day
  • Paid time off and the option to purchase additional vacation time