GCP Data Engineer

Stefanini Group, Dearborn, MI
Onsite

About The Position

Stefanini Group is hiring! Stefanini is looking for a GCP Data Engineer (Dearborn, MI). For quick apply, please reach out to Adil Khan at 248-728-6424 / [email protected]. We are looking for a candidate responsible for designing, building, and maintaining data solutions, including data infrastructure and pipelines, for collecting, storing, processing, and analyzing large volumes of data efficiently and accurately. By collaborating with cross-functional stakeholders and optimizing cloud performance, you will ensure the data platform remains secure, cost-effective, and highly available to power critical business insights.

Requirements

  • GCP, Big Data, Data Warehousing, Artificial Intelligence & Expert Systems, API
  • Senior Data Engineer with 7+ years of data engineering experience and 10+ years in software development, including AI/ML work.
  • Proficiency in Python programming.
  • Experience deploying and managing services on Google Cloud Platform, including Compute Engine, Cloud Storage, IAM, and Cloud Functions. For example, designing and implementing a cloud-native application architecture using GKE (Google Kubernetes Engine) with Cloud SQL and Pub/Sub.
  • Experience working with large-scale data processing frameworks such as Apache Spark, Dataflow, or BigQuery.
  • Experience designing and maintaining data warehouse solutions (e.g., BigQuery, Snowflake, Redshift). For example, modeling a star schema for a retail analytics platform that supports reporting on sales, inventory, and customer behavior.
  • Experience developing or integrating AI/ML models and rule-based expert systems. For example, building a classification model using Vertex AI to predict customer churn, or implementing a rule engine that automates underwriting decisions.
  • Experience designing, building, and consuming RESTful or gRPC APIs. For example, developing a versioned REST API with OAuth 2.0 authentication that serves as the integration layer between a mobile application and backend microservices.
  • Solid experience with SQL for data manipulation and querying.
  • Hands-on experience with Google Cloud Platform (GCP) services relevant to AI/ML.
  • Basic understanding and practical experience with Machine Learning model fine-tuning.
  • Familiarity with data engineering concepts and practices.
  • Expertise in prompt engineering techniques for interacting with LLMs.
  • Experience with the OpenAI SDK.
  • Experience developing robust APIs, preferably with FastAPI.
  • Proficiency with version control systems (e.g., Git).
  • Experience with containerization technologies (e.g., Docker).
  • Bachelor's Degree

Nice To Haves

  • Google Cloud Platform
  • Familiarity with advanced GCP services beyond core compute and storage, such as Vertex AI, Dataflow, Cloud Composer (Airflow), and BigQuery ML. For example, using Cloud Composer to orchestrate scheduled data pipelines that feed into a BigQuery data warehouse.
  • Certification Program

Responsibilities

  • Architect and scale end-to-end data pipelines on GCP, transforming complex telemetry and enterprise data into high-quality, analytics-ready assets using Medallion architectures.
  • Lead the implementation of robust CI/CD workflows, rigorous data governance, and security controls while mentoring junior talent and driving engineering best practices.
  • Use Terraform, Git, and Airflow to ensure reproducible, secure, and cost-optimized cloud infrastructure.
  • Prioritize data lineage, PII protection, and observability to maintain high trust in data assets.
  • Act as a bridge between technical teams (Data Science, Security) and business stakeholders to deliver self-service analytics.
  • Apply a strong understanding of Generative AI principles and architectures, including Large Language Models (LLMs) and Retrieval-Augmented Generation (RAG) systems.
  • Build and deploy RAG systems, including the use of vector databases.