CX Products Engineer

Ford
Dearborn, MI

About The Position

Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for the successful deployment of the Data Platform. You will implement automation across all parts of the pipeline to minimize manual effort in development and production, provide technical guidance, mentorship, and code-level support to the development team, and build solutions using GCP tools such as BigQuery, GCS, Dataflow, and Dataproc, while ensuring adherence to security, legal, and Ford standards and policy compliance. Ford also offers established and active employee resource groups.

Requirements

  • Minimum 5 years of in-depth experience with Java/Python
  • Minimum 5 years of experience building data engineering pipelines and data warehouse systems, with a solid understanding of ETL/ELT principles and the ability to write complex SQL queries
  • Minimum 5 years of experience with GCP-based big data deployments (batch and real-time) leveraging BigQuery, Google Cloud Storage, Pub/Sub, Data Fusion, and Dataproc
  • Minimum 2 years of development experience with data warehousing and big data technologies
  • 3 years of experience deploying Google Cloud services using Terraform
  • Bachelor's or master's degree in a related field
  • Experience working with AI/LLM models and generative AI
  • Understanding of data architecture and design independent of the underlying technology
  • Experience with BigQuery and SQL
  • Experience with Python and Apache Airflow

Nice To Haves

  • Exceptional problem-solving and communication skills, including managing multiple stakeholders
  • Experience working with Agile and Lean methodologies
  • Experience with Test-Driven Development
  • Exposure to AI/LLM

Responsibilities

  • Work as part of an implementation team from concept to operations
  • Provide deep technical subject matter expertise for successful deployment of the Data Platform
  • Implement methods for automation of all parts of the pipeline to minimize labor in development and production
  • Provide technical guidance, mentorship, and code-level support to the development team
  • Work with the team to develop and implement solutions using GCP tools (BigQuery, GCS, Dataflow, Dataproc, etc.)
  • Ensure adherence to security, legal, and Ford standard/policy compliance
  • Analyze and optimize the performance of data pipelines and related services to ensure efficiency and cost-effectiveness when dealing with large datasets
  • Participate in code reviews and contribute to improving code quality
  • Ensure implementation of DevSecOps and software craftsmanship practices (CI/CD, TDD, Pair Programming)