Analytics Engineering Specialist

HDI Global Insurance Company
Chicago, IL
$100,000 - $120,000

About The Position

We are seeking an Analytics Engineer to join our Data & Insights team and play a key role in building and maintaining our modern analytics platform. The role sits at the intersection of data engineering and analytics, with a strong focus on dbt, Snowflake, Python-based ETL, OpenFlow or similar tools, and analytics-ready data modeling. You will be central to migrating our enterprise data warehouse from Azure SQL Database (SQL Server) to Snowflake, with dbt as the primary transformation framework. The ideal candidate is comfortable owning data transformations end-to-end, partnering closely with data analysts and business stakeholders, and building scalable, well-documented data models that power reporting, dashboards, and downstream analytics. Prior experience with Guidewire is a plus, but we value strong analytics engineering fundamentals and the ability to learn complex domains.
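
For a sense of the hands-on work, the sketch below shows one plausible shape of a Python-based extract-and-load step in this kind of SQL Server-to-Snowflake migration. It is a minimal illustration, not our actual pipeline: it assumes the pyodbc, pandas, and snowflake-connector-python packages, and every connection detail and table name in it is hypothetical.

```python
# Minimal sketch of a SQL Server -> Snowflake extract/load step.
# Assumes pyodbc, pandas, and snowflake-connector-python are installed;
# all connection details and object names below are hypothetical.
import pandas as pd
import pyodbc
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Extract from the legacy Azure SQL Database (SQL Server) source.
src = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=legacy-edw.example.com;DATABASE=EDW;UID=etl_user;PWD=..."
)
df = pd.read_sql(
    "SELECT policy_id, premium, effective_date FROM dbo.policies", src
)

# Load into a Snowflake raw layer; dbt models transform from there.
snow = snowflake.connector.connect(
    account="example_account",
    user="ETL_USER",
    password="...",
    warehouse="LOAD_WH",
    database="RAW",
    schema="LEGACY_EDW",
)
# auto_create_table creates the target table from the dataframe schema
# if it does not already exist.
write_pandas(snow, df, "POLICIES", auto_create_table=True)
```

From the raw layer, dbt models would then handle the transformation into analytics-ready marts.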

Requirements

  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or related field, or equivalent experience.
  • 3+ years in analytics engineering, data engineering, or data warehousing roles.
  • Hands-on experience with dbt in a production environment.
  • Strong experience working with Snowflake and SQL Server.
  • Experience working in Agile/Scrum teams, participating in sprint ceremonies and iterative delivery.
  • Proficient with Jira, Confluence, or similar tools.
  • Advanced SQL and Python skills.
  • Experience with ELT/ETL-based data transformation patterns.
  • Familiarity with Git-based version control and CI/CD pipelines (see the sketch after this list).
  • Experience supporting BI tools such as Qlik, Power BI, or similar.
  • Experience participating in a data platform, analytics, or enterprise transformation initiative, such as migrating from legacy data warehouses or ETL tools to a modern cloud-based analytics stack.
  • Experience supporting the transition of stakeholders and downstream consumers to new data models, tools, or reporting paradigms.
  • Strong analytical and problem-solving skills.
  • Ability to communicate effectively with both technical and non-technical stakeholders.
  • Organized, detail-oriented, and able to manage multiple priorities.
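
As a concrete reference point for the dbt and CI/CD items above, automated deployments often amount to invoking the dbt CLI from a pipeline step. The sketch below is one minimal way to do that; it assumes dbt-core is installed with a Snowflake profile already configured, and the "marts" selector and "prod" target names are hypothetical.

```python
# Minimal CI step sketch: run and test dbt models before deployment.
# Assumes dbt-core is installed with a configured Snowflake profile;
# the "marts" selector and "prod" target names are hypothetical.
import subprocess
import sys

for args in (
    ["dbt", "run", "--select", "marts", "--target", "prod"],
    ["dbt", "test", "--select", "marts", "--target", "prod"],
):
    result = subprocess.run(args)
    if result.returncode != 0:
        sys.exit(result.returncode)  # fail the pipeline on any dbt error
```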

Nice To Haves

  • Experience with Guidewire Cloud Data Access (CDA) and Guidewire PolicyCenter data models.
  • Background in property & casualty insurance.
  • Experience with cloud platforms (Azure preferred).
  • Exposure to AI-enabled data platform features (e.g., Snowflake Cortex) or participation in analytics projects that support machine learning or generative AI use cases.
  • Familiarity with preparing, curating, and governing data for AI-driven analytics.
  • Familiarity with data governance, metadata management, or data cataloging practices.

Responsibilities

  • Analytics Engineering & ELT/ETL Development: Design, build, and maintain data ingestion and transformation pipelines using a combination of Python-based ETL workflows, OpenFlow, Snowflake-native ELT patterns, and SQL Server stored procedures as part of an ongoing EDW migration to Snowflake.
  • Develop, test, and maintain dbt models to transform raw and prepared data into analytics-ready, consumable data models in Snowflake.
  • Implement and enforce analytics engineering best practices, including modular modeling, testing, documentation, and version control.
  • Optimize Snowflake models for performance, scalability, and cost efficiency.
  • Data Modeling & Architecture: Design and maintain dimensional and analytics-friendly data models (facts, dimensions, marts).
  • Partner with Data Architect and Data Engineers to align analytics models with enterprise data standards.
  • Support ongoing evolution of the enterprise data warehouse and curated data layers.
  • Snowflake Platform Development: Work directly in Snowflake to develop ELT logic, optimize queries, and manage data structures.
  • Leverage Snowflake features such as tasks, streams, views, and secure data access where appropriate (a sketch follows this list).
  • Data Quality, Governance & DevOps: Implement data quality tests and validation logic within dbt.
  • Participate in CI/CD workflows for analytics code using Git and automated deployments.
  • Ensure adherence to security, compliance, and data governance standards, especially for regulated insurance data.
  • Working Style & Delivery Experience: Work as part of a Scrum or Agile delivery team, collaborating with product owners, analysts, and engineers.
  • Familiarity with agile ceremonies such as sprint planning, stand-ups, backlog grooming, and retrospectives.
  • Experience using Jira or similar work management tools (e.g., Azure DevOps, Confluence) to track work, manage backlogs, and document requirements.
  • Ability to manage work through user stories, tasks, and sprint commitments, delivering incremental, high-quality data assets.
  • Customer Support: Provide customer assistance with reporting tools, software, and reports.
  • Actively seek out relationships with the customer community, proactively communicating relevant information and discussing their needs, initiatives, and service levels through periodic meetings.
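
As noted in the tasks-and-streams item above, Snowflake-native incremental ELT commonly pairs a stream (change capture) with a scheduled task. The sketch below is a minimal illustration of that pattern using the Snowflake Python connector; every object name in it is hypothetical.

```python
# Minimal sketch: Snowflake stream + task for incremental ELT.
# Executed via snowflake-connector-python; all object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account", user="ETL_USER", password="...",
    warehouse="TRANSFORM_WH", database="RAW", schema="LEGACY_EDW",
)
cur = conn.cursor()

# Capture change rows landing in the raw table.
cur.execute("CREATE OR REPLACE STREAM POLICIES_STREAM ON TABLE POLICIES")

# Scheduled task that copies new rows into a curated table,
# firing only when the stream actually has data.
cur.execute("""
    CREATE OR REPLACE TASK LOAD_POLICIES
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('POLICIES_STREAM')
    AS
      INSERT INTO CURATED.POLICIES
      SELECT policy_id, premium, effective_date FROM POLICIES_STREAM
""")
cur.execute("ALTER TASK LOAD_POLICIES RESUME")  # tasks start suspended
```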

Benefits

  • 401(k) with company match
  • Paid Time Off
  • Sick Leave
  • Medical
  • Health Reimbursement Arrangement (HRA)
  • Telemedicine
  • Wellness Program
  • Employee Assistance Program (EAP)
  • Dental
  • Vision
  • Accident & Critical Illness Insurance
  • Flexible Spending Account (FSA)
  • Dependent Care FSA
  • Group and Voluntary Life Insurance
  • Short- and Long-Term Disability
  • Pet Insurance
  • Transit and Parking benefits