Data Engineer

Hunkemöller
Deerfield, IL
Hybrid

About The Position

Hunkemöller is looking for a Data Engineer to take a core role in our digital data transformation: helping to implement and scale our enterprise data warehouse and data mesh on Google Cloud (GCP). This is a fantastic opportunity for a forward-thinking, adaptable data engineering professional to help build our next-generation, cloud-native data platform. You will join a collaborative environment where business and IT work closely together to support our retail and omnichannel setup, working with a team of internal and external engineers to drive key data transformation initiatives.

We believe in an "AI-first" development approach. We are looking for an agile learner who leverages modern AI tools (such as Gemini and Claude) alongside traditional engineering skills (GCP, dbt, Python) to accelerate development and solve complex problems creatively. The right candidate will be excited by the prospect of building a scalable data platform that supports next-generation analytics, and will actively seek out ways to make our data processes faster and more efficient.

Requirements

  • 1 to 3 years of hands-on experience in data engineering, with a strong track record of building data pipelines and working with data warehouse solutions.
  • A strong desire to learn quickly and adapt to new technologies. You embrace modern development practices and are comfortable using AI tools as a force multiplier in your daily work.
  • Advanced skills in writing, optimizing, and debugging complex SQL queries for data manipulation and analysis (see the sketch after this list).
  • Solid knowledge of cloud data services, preferably on Google Cloud Platform (BigQuery, Dataflow, etc.).
  • Experience with Python and applying software engineering principles to data solutions.
  • Proficient in writing clean, well-documented, and tested code, with strong experience using Git and CI/CD workflows.
  • Excellent written and verbal English communication skills.
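
To give a flavor of the day-to-day SQL work, here is a minimal BigQuery-style sketch of a common task: deduplicating a raw feed so downstream models only see the latest version of each record. All project, table, and column names are hypothetical, not Hunkemöller's actual schema.

    -- Keep only the most recent version of each order.
    -- `my-project.raw.orders` and its columns are illustrative placeholders.
    SELECT * EXCEPT (row_num)
    FROM (
      SELECT
        *,
        ROW_NUMBER() OVER (
          PARTITION BY order_id
          ORDER BY updated_at DESC
        ) AS row_num
      FROM `my-project.raw.orders`
    )
    WHERE row_num = 1;

Variations of this pattern (window functions, QUALIFY clauses, partition filters) come up constantly when building and optimizing pipelines on BigQuery.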

Nice To Haves

  • Hands-on experience with dbt

Responsibilities

  • Develop, test, and maintain robust, scalable data pipelines using SQL, dbt, and cloud technologies (GCP), ensuring high standards of data quality and reliability.
  • Actively leverage advanced AI coding assistants and LLMs to accelerate pipeline development, debug complex code, generate documentation, and automate repetitive tasks.
  • Assist in the implementation of scalable data models (e.g., star schemas, data vaults) within our enterprise data warehouse (BigQuery), as sketched in the example after this list.
  • Build and maintain our Google Cloud Platform (GCP) data infrastructure, focusing on automation, security, and performance improvements.
  • Partner with Product, Data, and Design teams to resolve technical data issues.
  • Participate in code reviews and continuously learn and share new engineering best practices.
  • Build and optimize data platforms that power our BI, Data Science, and AI solutions, ensuring data is accessible, reliable, and ready for analysis.
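
As an illustration of how dbt, BigQuery, and star-schema modeling fit together in these responsibilities, here is a minimal sketch of an incremental fact table. All model, table, and column names are hypothetical and shown only to convey the style of work.

    -- Hypothetical dbt model: models/marts/fct_orders.sql
    {{ config(
        materialized = 'incremental',
        unique_key   = 'order_id',
        partition_by = {'field': 'order_date', 'data_type': 'date'}
    ) }}

    SELECT
      o.order_id,
      o.order_date,
      c.customer_key,   -- foreign key into a hypothetical dim_customers
      s.store_key,      -- foreign key into a hypothetical dim_stores
      o.net_amount
    FROM {{ ref('stg_orders') }} AS o
    LEFT JOIN {{ ref('dim_customers') }} AS c
      ON o.customer_id = c.customer_id
    LEFT JOIN {{ ref('dim_stores') }} AS s
      ON o.store_id = s.store_id
    {% if is_incremental() %}
      -- On incremental runs, process only orders newer than what is already loaded.
      WHERE o.order_date > (SELECT MAX(order_date) FROM {{ this }})
    {% endif %}

Tests and documentation for a model like this would live in an accompanying schema.yml, with the whole project running through Git-based CI/CD as described in the requirements above.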

Benefits

  • 25 days of annual leave, with the option to buy or sell up to 4 additional days
  • Hybrid work model, combining office and remote working
  • Possibility to work from abroad for up to 2 weeks per year
  • An international work environment, working with teams across different countries
  • Travel allowance to support commuting costs
  • Access to the Hunkemöller Academy for professional development
  • 25% employee discount on all Hunkemöller products