Utrecht Art Supply • Posted 3 months ago
$120,000 - $150,000/Yr
Full-time • Senior
Highland Park, IL
101-250 employees
General Merchandise Retailers

The Technical Lead for Data Warehouse & Data Integration will take ownership of delivery execution and modernization across our Data Warehouse (DW) and Reporting platforms. This role is ideal for someone who is not only a strong technical implementer but also a delivery-focused leader, ready to drive change, build clean and scalable systems, and lead by example. The Technical Lead will be at the center of our data engineering transformation: leading a globally distributed team, working closely with business and technical stakeholders, and helping us evolve from legacy tools toward a modern, cloud-enabled data stack. This is a hands-on leadership role that will help shape the future of Blick's data capabilities by modernizing our data pipelines and architecture, improving our ETL approaches, and delivering accurate, high-quality data to business users across the organization. From operational reports to strategic dashboards, the work will directly influence decision-making across merchandising, marketing, supply chain, and more.

Responsibilities

  • Lead and guide engineers (primarily offshore) responsible for ETL development, data pipelines, and data warehouse loads.
  • Build and maintain workflows for data ingestion, cleansing, transformation, and transfer using tools like SSIS, Python, ADF, and Fabric (a minimal sketch of such a step follows this list).
  • Support both legacy systems (on-prem SQL Server, SSRS) and modern platforms (Redshift, MS Fabric, Power BI), with a focus on enabling transition to Azure-based infrastructure.
  • Be highly hands-on: capable of coding, debugging, reviewing, and delivering alongside your team.
  • Drive sprint planning, backlog grooming, and progress reporting using Scrum practices and tools such as JIRA and dashboards.
  • Lead end-to-end delivery of data initiatives, from requirement intake to production deployment and stakeholder sign-off.
  • Proactively manage risks, remove blockers, and ensure delivery timelines and quality standards are met.
  • Establish and enforce clean data architecture principles and practices for all pipeline development.
  • Take ownership of data validation, cleansing, and transformation standards, especially where tools are currently lacking.
  • Bring practical experience with, or advocate for, data quality frameworks (e.g., Great Expectations, or custom tooling) to elevate data reliability and observability (see the validation sketch after this list).
  • Collaborate cross-functionally with Product Owners, Architects, Developers, Analysts, and Business Leaders to ensure that data delivery aligns with business needs.
  • Act as a key liaison to stakeholders, representing the technical roadmap, delivery plans, and status updates.
  • Partner with peers such as the Sr. BI Developer and Lead DBA to ensure alignment across reporting, infrastructure, and data modeling.
  • Be a driving force in modernizing our DW/ETL stack, championing new tools, clean architecture, and cloud-first principles.
  • Contribute to the evaluation and implementation of platforms such as Fabric, Azure Synapse, Redshift, and Terraform, as appropriate.
  • Explore and implement AI-enabled tooling or automation to improve data engineering velocity, testing, and monitoring.
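
To make the hands-on side concrete, here is a minimal sketch of the kind of ingest-cleanse-load step described above. The file name, column names, staging table, and connection string are hypothetical placeholders for illustration, not references to Blick's actual systems.

```python
# Illustrative only: a pared-down ingest -> cleanse -> load step.
# "orders.csv", the column names, and the connection string are
# hypothetical placeholders, not actual systems or credentials.
import pandas as pd
from sqlalchemy import create_engine

def load_orders(csv_path: str, conn_str: str) -> int:
    """Ingest a raw extract, apply basic cleansing, and load to staging."""
    df = pd.read_csv(csv_path, parse_dates=["order_date"])

    # Cleansing: drop exact duplicates, normalize keys, coerce amounts.
    df = df.drop_duplicates()
    df["order_id"] = df["order_id"].astype(str).str.strip()
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    df = df.dropna(subset=["order_id", "amount"])

    # Load: append into a staging table that warehouse loads pick up.
    engine = create_engine(conn_str)
    df.to_sql("stg_orders", engine, if_exists="append", index=False)
    return len(df)

if __name__ == "__main__":
    rows = load_orders("orders.csv", "mssql+pyodbc://user:pass@dsn")
    print(f"Loaded {rows} rows into stg_orders")
```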
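
The "custom tooling" option for data validation can start as small as a declarative list of checks run before a load is promoted; frameworks like Great Expectations generalize the same idea with richer expectations and reporting. The rules below are hypothetical examples, not established standards for this role.

```python
# Illustrative custom validation tooling: declarative checks that run
# against a DataFrame before a load is promoted. Rules are hypothetical.
import pandas as pd

CHECKS = [
    ("order_id is never null", lambda df: df["order_id"].notna().all()),
    ("order_id is unique",     lambda df: df["order_id"].is_unique),
    ("amount is non-negative", lambda df: (df["amount"] >= 0).all()),
]

def validate(df: pd.DataFrame) -> list[str]:
    """Return the names of any failed checks (empty list means clean)."""
    return [name for name, check in CHECKS if not check(df)]

if __name__ == "__main__":
    sample = pd.DataFrame({"order_id": ["a1", "a2"], "amount": [10.0, 0.0]})
    failures = validate(sample)
    if failures:
        raise SystemExit(f"Validation failed: {failures}")
    print("All checks passed")
```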

Qualifications

  • Bachelor's degree in Computer Science, Information Systems, or equivalent experience.
  • 7 to 10 years of experience in data warehousing, including 3+ years in a technical leadership role.
  • Expert-level proficiency in at least one leading ETL tool such as SSIS, Azure Data Factory (ADF), Talend, Informatica, or AWS Glue.
  • Strong experience with cloud platforms (AWS, Azure, or GCP), especially in data engineering and analytics services.
  • Strong experience with SQL (T-SQL, PL/SQL) and Python.
  • Strong experience with cloud data platforms such as Redshift, as well as on-prem SQL Server.
  • Strong experience with Power BI and SSRS.
  • Proven ability to lead delivery, manage priorities, and report progress in a structured, business-facing manner.
  • Deep understanding of data quality, cleansing, and architecture best practices.
  • Strong communication skills and the ability to work independently in a fast-paced, cross-functional environment.
  • High-ownership mindset: able to lead projects, coach team members, and make informed technical decisions.

Preferred

  • Experience with modern data platforms like Redshift, Fabric, BigQuery, or Databricks.
  • Familiarity with DevOps/CI-CD practices for data pipeline deployment.
  • Understanding of data security, privacy, and governance frameworks.
  • Exposure to DataOps practices or AI-assisted data validation/testing frameworks.
  • Familiarity with e-commerce or retail business intelligence use cases.

Benefits

  • Medical/Dental/Vision Insurance
  • 401K & Profit Sharing Plan
  • Incentive Bonus Plans
  • Paid Holidays & Paid Time Off
  • Paid Parental Leave
  • Short-Term/Long-Term Disability
  • Training Opportunities
  • Basic & Optional Life Insurance
  • Employee Discount