Senior Data Engineer

Hub International Insurance · Chicago, IL
$135,000 - $150,000

About The Position

We are looking for a Senior Data Engineer – Enterprise Data & Analytics to join our DnA Infrastructure team. In this role, you will architect and build the data pipelines and infrastructure that enable advanced analytics across the enterprise. You'll play a critical part in modernizing our data platform, migrating from legacy systems, and delivering scalable, high-quality solutions powered by Google Cloud Platform (GCP) and infrastructure-as-code tools like Terraform.

Requirements

  • 8–15 years of progressive experience in data engineering, analytics infrastructure, or a related field.
  • Extensive experience with Google Cloud Platform, including BigQuery, Cloud Storage, Dataform, Cloud Composer, Cloud Run, and Cloud Functions.
  • Proficiency in Terraform for managing infrastructure across GCP environments.
  • Strong knowledge of SQL, data modeling (star/snowflake schemas), and modern ELT pipeline development.
  • Hands-on experience with tools such as dbt, Python, and other data transformation frameworks.
  • Familiarity with Vertex AI, Cloud Notebooks, or similar ML platforms.
  • Strong understanding of CI/CD tools and practices, ideally including Cloud Build, GitHub Actions, or Terraform Cloud.
  • Experience with data visualization platforms such as Looker, Looker Studio, or similar BI tools.
  • Deep understanding of data governance, metadata management, and lineage using tools like Dataplex.
  • Excellent verbal and written communication skills, with the ability to explain complex technical topics to business stakeholders.
  • Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent experience).
  • Ability to thrive in a fast-paced environment with shifting priorities and global collaboration.
  • Willingness to occasionally work non-standard hours for critical deployments or support.

Nice To Haves

  • Certifications such as Google Cloud Professional Data Engineer, Cloud Architect, or HashiCorp Terraform Associate are preferred.

Responsibilities

  • Design, develop, and support scalable data models and ETL/ELT pipelines using GCP technologies.
  • Build and optimize cloud-native data solutions leveraging BigQuery, Dataform, Cloud Composer, Cloud Run, Cloud Functions, and Cloud Storage.
  • Use Terraform to manage GCP infrastructure as code, ensuring scalable, repeatable, and secure deployments.
  • Lead the decommissioning of legacy systems while enhancing the current data platform.
  • Contribute to machine learning and data science initiatives through integrations with Vertex AI and Cloud Notebooks.
  • Establish and maintain data quality and validation rules to ensure trust in enterprise data assets.
  • Document data pipelines, architecture, and Terraform modules to support team transparency and maintainability.
  • Coordinate migrations and deployments across environments (Development, UAT, Staging, Production) using modern CI/CD practices.
  • Collaborate cross-functionally within Agile/Scrum teams to deliver data solutions aligned with business needs.
  • Provide production support and performance tuning for data systems in GCP.
  • Mentor junior engineers and contribute to the professional development of consultants and interns.

Benefits

  • HUB International is proud to offer comprehensive benefit and total compensation packages, which may include health, dental, vision, life, and disability insurance; FSA, HSA, and 401(k) accounts; paid-time-off benefits such as vacation, sick, and personal days; and, for some positions, eligible bonuses, equity, and commissions.