Internship Data & Analytics - Back-end

Hunkemöller
Deerfield, IL

About The Position

As an intern at Hunkemöller, you will dive deep into core datasets in BigQuery, enriching the data catalog with detailed column-level descriptions and business context. You will classify data assets, establish data lineage, and curate a comprehensive data dictionary to improve data discovery and governance.

You will also quantify and optimize the cost/performance trade-offs of dbt materialization strategies (table, incremental, view) and BigQuery billing models (Standard vs. Editions) for high-traffic models, and implement environment-specific optimization strategies. Building on this, you will develop a pipeline that pushes dbt run metadata (e.g., last run time) to Looker, enabling intelligent cache refreshes and improved dashboard performance.

On the quality side, you will design and deploy a data quality monitoring loop that triggers Looker alerts when dbt data quality tests fail, and you will measure incident response times. You will create a dbt-based process to automatically detect and remediate schema drift in source data, including generating alerts, stopping affected models, or applying auto-migrations. Analyzing usage data to identify which dbt models drive the most value (e.g., most viewed dashboards, frequently run explores) and recommending models for deprecation or further optimization will also be part of your responsibilities. Lastly, you will assist in migrating legacy data models to the current data warehouse and support the deprecation of obsolete models.
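For context on the materialization trade-off mentioned above: in dbt, materialization is a per-model config, and switching between view, table, and incremental changes how much data BigQuery scans on each run. A minimal sketch of an incremental model (model, column, and ref names here are hypothetical, not from an actual Hunkemöller project):

```sql
-- models/marts/fct_orders.sql (hypothetical model)
{{ config(
    materialized='incremental',
    unique_key='order_id'
) }}

select
    order_id,
    customer_id,
    order_total,
    updated_at
from {{ ref('stg_orders') }}

{% if is_incremental() %}
  -- On incremental runs, only scan rows newer than what is
  -- already in the target table, reducing bytes billed.
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

Comparing bytes scanned and slot time for this model under `materialized='view'`, `'table'`, and `'incremental'` is one way the cost/performance trade-off could be quantified.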

Requirements

  • Pursuing a degree in Computer Science, Data Science, Information Management, or a related field.
  • Strong understanding of data management concepts, including data warehousing, data modeling, and data governance.
  • Experience or familiarity with SQL and cloud-based data platforms (ideally Google Cloud Platform).
  • Exposure to data visualization tools (e.g., Tableau, Power BI, Looker) is a plus.
  • Excellent communication and collaboration skills.
  • Self-motivated and eager to learn new technologies in a fast-paced environment.


Responsibilities

  • Deep dive into core datasets in BigQuery.
  • Enrich the data catalog with detailed column-level descriptions and business context.
  • Classify data assets and establish data lineage.
  • Curate a comprehensive data dictionary to enhance data discovery and governance.
  • Quantify and optimize cost/performance trade-offs of dbt materialization strategies.
  • Implement environment-specific optimization strategies.
  • Develop and test a pipeline to push dbt run metadata to Looker.
  • Design and deploy a data quality monitoring loop that triggers Looker alerts.
  • Create a dbt-based process to detect and remediate schema drift in source data.
  • Analyze usage data to identify valuable dbt models.
  • Assist in migrating legacy data models to the current data warehouse.
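As one concrete illustration of the schema drift responsibility above, a singular dbt test can fail whenever a source table gains columns outside an expected list, by querying BigQuery's INFORMATION_SCHEMA. A minimal sketch (project, dataset, table, and column names are hypothetical):

```sql
-- tests/assert_orders_schema_stable.sql (hypothetical singular dbt test)
-- A singular test fails when it returns one or more rows.
-- Here, any column not in the expected list below counts as drift.
select column_name
from `my-project.raw_shop.INFORMATION_SCHEMA.COLUMNS`
where table_name = 'orders'
  and column_name not in ('order_id', 'customer_id', 'order_date', 'amount')
```

Wired into a scheduled `dbt test` run, a failure here could feed the alerting loop described above (e.g., surfacing in Looker) before downstream models break.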

Benefits

  • Internship allowance of €450 per month (gross), plus 8+ holiday days.
  • Travel allowance of up to €11.50 per day.
  • Access to 25% staff discount.
  • Company laptop provided.
  • Participation in company gatherings like Townhalls and drinks.
  • Work from home policy allowing up to 3 days per week.