Cloud Data Engineer

Hunkemöller
Deerfield, IL

About The Position

We believe in an "AI-first" development approach. We are looking for an agile learner who leverages modern AI tools (like Gemini, Claude, and Copilot) alongside strong Python and GCP engineering skills to accelerate development and creatively solve complex integration challenges.

The Role

This position is tailored for a Data Engineer who loves the "plumbing" of data: connecting APIs, orchestrating workflows, and moving data seamlessly between systems. You will own our upstream ingestion framework, manage our Google Cloud infrastructure (Cloud Composer, Dataflow, Cloud Run), and power our Reverse ETL processes to ensure our operational systems have the data they need.

Hunkemöller is looking for a Cloud Data Engineer to take a core role in our digital data transformation. This is a fantastic opportunity for a forward-thinking, adaptable data engineering professional to help build and scale the ingestion and infrastructure backbone of our next-generation data platform on GCP.

Hunkemöller's mission is to be a much-loved, social and inclusive brand, powered by our people. We have over 900 stores in 15 countries, and we are growing. Our plans to expand both in Europe and beyond provide exceptional opportunities for those with a passion for retail. Indeed, passion is one of our six values: fun, inclusive, passionate, sexy, in-touch and inspiring. These values drive all we do, and together with our growing commitment to the well-being of people, planet and communities, we are building a brand aligned to the demands of our global customers and those who work with us. Our USPs are many; however, at the heart of what makes us special is a commitment to world-class service, whatever the channel and wherever the store. This commitment extends to how we work with each other and to creating a world-class working environment. Hunkemöller is certified Top Employer of the Netherlands (for the third time in a row) and Germany 2020, which underlines our people initiatives and achievements.

Requirements

  • Core Data Engineering Experience: 3 to 5 years of hands-on experience in data engineering, with a strong focus on data integration, APIs, and pipeline architecture.
  • AI Adaptability & Continuous Learning: A strong desire to learn quickly and adapt to new technologies. You embrace modern development practices and are highly comfortable using AI tools as a force multiplier in your daily work.
  • API Integration & Python: Strong Python programming skills with a proven track record of building custom API extractors, handling pagination and rate limiting, and working with REST/GraphQL endpoints (a minimal extractor sketch follows this list).
  • GCP Service Expertise: Hands-on experience with Google Cloud Platform's ecosystem, specifically Cloud Composer, Dataflow, Cloud Run, and Firestore.
  • Code Quality & CI/CD: Proficient in writing clean, well-documented, and tested code (e.g., with pytest), with strong experience using Git, Docker, and CI/CD pipelines (a sample test sketch also follows this list).
  • English Proficiency: Excellent written and verbal English communication skills.
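
To make the API-integration requirement concrete, here is a minimal sketch of the kind of extractor we mean: a cursor-paginated REST pull with rate-limit backoff. The endpoint URL, the limit/cursor parameter names, and the next_cursor response field are hypothetical; real APIs vary.

    import time

    import requests

    BASE_URL = "https://api.example.com/v1/orders"  # hypothetical endpoint

    def fetch_all(session) -> list[dict]:
        """Pull every page from a cursor-paginated REST endpoint,
        backing off when the server rate-limits us (HTTP 429)."""
        records: list[dict] = []
        cursor = None
        while True:
            params = {"limit": 100}
            if cursor:
                params["cursor"] = cursor  # pagination field name assumed
            resp = session.get(BASE_URL, params=params, timeout=30)
            if resp.status_code == 429:
                # Honor the server's Retry-After hint before retrying.
                time.sleep(float(resp.headers.get("Retry-After", "5")))
                continue
            resp.raise_for_status()
            payload = resp.json()
            records.extend(payload["data"])
            cursor = payload.get("next_cursor")
            if not cursor:
                return records

    if __name__ == "__main__":
        with requests.Session() as session:
            # In production the token would come from Secret Manager, not code.
            session.headers["Authorization"] = "Bearer <token>"
            print(f"fetched {len(fetch_all(session))} records")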
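
The "tested code" expectation applies to extractors like the one above. A minimal pytest sketch, stubbing the session so no live API is needed (the extractor module name is assumed):

    from extractor import fetch_all  # module name assumed

    class FakeResponse:
        def __init__(self, status_code, payload=None, headers=None):
            self.status_code = status_code
            self._payload = payload or {}
            self.headers = headers or {}

        def json(self):
            return self._payload

        def raise_for_status(self):
            pass  # the stub never returns an error we want raised

    class FakeSession:
        """Returns one 429, then two pages of data, to exercise both paths."""
        def __init__(self):
            self._queue = [
                FakeResponse(429, headers={"Retry-After": "0"}),
                FakeResponse(200, {"data": [{"id": 1}], "next_cursor": "abc"}),
                FakeResponse(200, {"data": [{"id": 2}], "next_cursor": None}),
            ]

        def get(self, url, params=None, timeout=None):
            return self._queue.pop(0)

    def test_fetch_all_handles_rate_limit_and_pagination():
        assert fetch_all(FakeSession()) == [{"id": 1}, {"id": 2}]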

Nice To Haves

  • Experience managing Infrastructure as Code (specifically Terraform) or working with downstream data transformation tools (dbt).

Responsibilities

  • Build Ingestion Pipelines: Design, develop, and deploy robust data ingestion pipelines from various third-party APIs, webhooks, and source systems into Google Cloud.
  • AI-Augmented Engineering: Actively leverage advanced AI coding assistants to accelerate pipeline development, generate boilerplate API connection code, debug complex scripts, and automate repetitive tasks.
  • GCP Infrastructure & Orchestration: Build and manage data workflows using Cloud Composer (Airflow), and leverage Cloud Run and Dataflow for scalable, containerized data processing (see the example DAG after this list).
  • Drive Reverse ETL: Architect and maintain the data pipelines that push refined data from BigQuery back into our operational platforms (marketing tools, CRM, etc.) to drive business action (a Reverse ETL sketch also follows).
  • Manage Operational Databases: Utilize Firestore and other NoSQL/relational databases to support operational data needs and microservices (a Firestore write sketch also follows).
  • Collaborate and Learn: Partner with our data modeling specialists to ensure smooth handoffs between ingestion and transformation. Participate in code reviews and continuously share new engineering best practices.
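
As a rough illustration of the orchestration work, here is a minimal Cloud Composer DAG sketch: an hourly extract-then-load chain with retries. The DAG id, schedule, and task bodies are placeholders, not a description of our actual pipelines; the schedule= argument assumes Airflow 2.4+.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_to_gcs(**context):
        """Call the third-party API and land raw JSON in a GCS bucket.
        (Body omitted; it would reuse an extractor like the one above.)"""

    def load_to_bigquery(**context):
        """Load the newly landed files into a BigQuery staging table."""

    with DAG(
        dag_id="ingest_example_source",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@hourly",
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    ):
        extract = PythonOperator(task_id="extract_to_gcs", python_callable=extract_to_gcs)
        load = PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)
        extract >> load  # run the load only after extraction succeeds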
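
And a sketch of the Reverse ETL pattern: read a refined audience table from BigQuery and push rows to an operational platform's REST API. The table name, columns, and CRM endpoint are hypothetical.

    import requests
    from google.cloud import bigquery

    def push_segment_to_crm() -> None:
        """Send the last day's refreshed marketing segments to the CRM."""
        client = bigquery.Client()  # uses application-default credentials
        query = """
            SELECT customer_id, email, segment
            FROM `analytics.marketing_segments`  -- hypothetical table
            WHERE updated_at >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
        """
        with requests.Session() as session:
            for row in client.query(query).result():
                resp = session.post(
                    "https://crm.example.com/api/contacts",  # hypothetical endpoint
                    json={
                        "id": row["customer_id"],
                        "email": row["email"],
                        "segment": row["segment"],
                    },
                    timeout=10,
                )
                resp.raise_for_status()

For large segments, a batch endpoint or a Dataflow job would replace the row-by-row POSTs.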
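
Finally, a minimal Firestore write of the kind the operational-databases bullet describes; the collection name and document shape are assumptions.

    from google.cloud import firestore

    def upsert_customer_profile(customer_id: str, profile: dict) -> None:
        """Merge an operational customer document into Firestore so
        microservices can read it with low latency."""
        db = firestore.Client()  # uses application-default credentials
        db.collection("customer_profiles").document(customer_id).set(profile, merge=True)

    upsert_customer_profile("c-123", {"segment": "vip", "email_opt_in": True})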