Manager, Data Engineering

Ford Motor Company
Dearborn, MI
Hybrid

About The Position

We made history and now we work to transform the future – for our customers, our communities and our families. You'll see your work on the road every day, helping people move freely and pursue their dreams. At Ford, you can build more than vehicles. Come build what matters.

Do you believe data tells the real story? We do! Redefining mobility requires quality data, metrics and analytics, as well as insightful interpreters and analysts. That's where Global Data Insight & Analytics makes an impact. We advise leadership on business conditions, customer needs and the competitive landscape. With our support, key decision makers can act in meaningful, positive ways. Join us and use your data expertise and analytical skills to drive evidence-based, timely decision making.

As a Design Cost & Complexity (DCC) Analytics Data Engineering Manager, you will be at the heart of our data ecosystem, leading the team that builds and maintains the data pipelines that support DCC Analytics. You and your team will be responsible for designing, developing, and maintaining the foundational data assets and services that empower Artificial Intelligence, Data Science and Software Engineering. You'll also play a pivotal role in Ford's Data Hub strategy, contributing to domain-focused warehouses that serve as the single source of truth for the enterprise. You'll champion data engineering standardization by providing design input on new data engineering capabilities and implementing those capabilities across the PLMA datasets.

This is a fantastic opportunity for an experienced data engineering professional to make a significant impact. You'll be responsible for guiding the team in designing effective data curation solutions, prioritizing tasks, making timely decisions, and ensuring the delivery of high-quality results. Your expertise in data governance, customer consent, and security standards will be crucial in ensuring we operate responsibly and ethically with data.

Requirements

  • Bachelor’s degree in Computer Science, Information Technology, Information Systems, Data Analytics, or a related field.
  • 8+ years of experience in complex data environments, demonstrating increasing responsibilities and achievements, with:
      • Expertise in programming languages such as Python or Scala, and strong SQL skills.
      • Experience with ETL/ELT processes, data warehousing, and data modeling.
      • Experience with CI/CD pipelines, Docker, and Git/Gerrit, plus experience designing resilient deployment strategies and sophisticated release management.
      • Familiarity with data governance, privacy, quality, and monitoring.
      • Proven experience in implementing sophisticated testing strategies, driving quality tool adoption, establishing comprehensive code review processes, and setting observability standards with advanced monitoring and proactive alerting.
  • 5+ years of experience within the automotive industry or related product development environments and product lifecycle management.
  • 5+ years of experience leading software or data engineering teams, with a focus on team development and project success.
  • 5+ years of experience in Big Data environments or expertise with Big Data tools, including:
      • Data processing frameworks and data modeling.
      • In-depth knowledge and practical experience with Google Cloud Platform services.
      • Proven experience in monitoring and optimizing costs and compute resources on hyperscaler platforms.
  • Significant experience leveraging Generative AI and LLMs to optimize data engineering workflows (e.g., automated code generation, documentation, or metadata management).

Nice To Haves

  • Master's degree in Computer Science, Engineering, or a related field.
  • Expertise in GCP-based data engineering services such as BigQuery, Dataflow, Airflow, Dataform, Datastream, Apache Beam, Cloud Run, and Cloud Functions.
  • Familiarity with automotive Product Development processes, including program planning, design validation, and cross-functional collaboration across engineering, manufacturing, and supplier teams to deliver data-driven insights at each lifecycle stage.
  • Experience in managing and scaling serverless applications and clusters, focusing on resource optimization and robust monitoring and logging strategies.
  • Proficiency in unstructured data ingestion, including experience with data modeling and preparation techniques to support AI and machine learning workloads.
  • Experience with AI architecture and AI-enabling technologies (e.g., graph databases, vector databases).
  • Familiarity with data visualization tools (e.g., Power BI, Tableau).
  • Working knowledge of ontology, semantic modeling, and related technologies.

Responsibilities

  • Lead, mentor, and develop a high-performing team of local and remote Portfolio Data Engineers, fostering a culture of collaboration, innovation, and continuous improvement.
  • Strategically prioritize and manage team workloads, ensuring effective task allocation and resource capacity to support team goals.
  • Provide expert technical guidance and mentorship, ensuring adherence to best practices, coding standards, and architectural guidelines.
  • Act as the Chief Data Technical Anchor for the PLMA domain, resolving critical incidents through Root Cause Analysis (RCA) and implementing permanent, resilient architectural fixes.
  • Oversee the design, development, maintenance, scalability, reliability, and performance of data platform pipelines, aligning them with business needs and strategic objectives.
  • Contribute to the long-term strategic direction of the Data Platform by proactively identifying opportunities for best practice adoption and standardization.
  • Champion data quality, governance, and security standards, ensuring compliance and safeguarding sensitive data assets.
  • Enhance efficiency and reduce redundancy by consolidating common tasks across teams.
  • Effectively communicate decisions to stakeholders, building strong relationships and ensuring alignment on data initiatives.
  • Maintain awareness of industry trends and emerging technologies to inform technical decisions.
  • Lead the implementation of customer requests into data assets, ensuring optimized design and code development.
  • Guide the team in delivering scalable, robust data solutions and contribute hands-on to critical projects, including design and code reviews.
  • Lead technical decisions that drive data innovation and resilience.
  • Demonstrate full stack cloud data engineering expertise, covering automation, versioning, ingestion, integration, transformation, optimization, and data modeling.
  • Engage in agile planning, including scope definition, work breakdown structure, and roadblock resolution.
  • Design solutions for cost and consumption optimization, scalability, and performance.
  • Collaborate with Data Architecture and stakeholders on solution design, data consolidation, retention, purpose of use, compliance, and audit requirements.
  • Drive engineering excellence by establishing and monitoring SWE-centric quality metrics (including DORA metrics and P99 latency targets).

Benefits

  • Immediate medical, dental, vision and prescription drug coverage
  • Flexible family care days, paid parental leave, new parent ramp-up programs, subsidized back-up child care and more
  • Family building benefits including adoption and surrogacy expense reimbursement, fertility treatments, and more
  • Vehicle discount program for employees and family members and management leases
  • Tuition assistance
  • Established and active employee resource groups
  • Paid time off for individual and team community service
  • A generous schedule of paid holidays, including the week between Christmas and New Year’s Day
  • Paid time off and the option to purchase additional vacation time.