Senior Data Engineer

Kemper · Chicago, IL
$99,000 - $164,800 · Hybrid

About The Position

Kemper is one of the nation’s leading specialized insurers. Our success is a direct reflection of the talented and diverse people who make a positive difference in the lives of our customers every day. We believe a high-performing culture, valuable opportunities for personal development and professional challenge, and a healthy work-life balance can be highly motivating and productive. Kemper’s products and services are making a real difference to our customers, who have unique and evolving needs. By joining our team, you are helping to provide an experience to our stakeholders that delivers on our promises.

Position Summary

We are seeking a highly skilled Senior Data Engineer to guide the design, development, and delivery of our enterprise data platforms. This role requires deep technical expertise in Snowflake, core AWS services, cloud-based architecture, and modern data engineering frameworks. As a senior engineer, you will define architectural patterns, mentor engineering teams, and ensure the successful execution of strategic data initiatives that support analytics, reporting, and business intelligence efforts across the organization.

Requirements

  • 8+ years of experience in data engineering, with 2+ years in a senior, lead, or architect capacity.
  • Advanced hands-on expertise in Snowflake, including staying current with new and advanced platform features.
  • Strong experience implementing Data Vault 2.0 models and automated ELT frameworks.
  • Proficiency with ETL/ELT and data integration tools such as Informatica IDMC, PowerCenter, Pentaho, AWS Glue, Control-M, and Python-based pipelines.
  • Deep understanding of:
      - Data warehousing and analytics engineering principles
      - Dimensional modeling and relational structures
      - ELT patterns, CDC frameworks, and metadata-driven design
      - Orchestration frameworks (Control-M, Airflow)
  • Strong SQL expertise with experience in stored procedures, Snowflake Scripting, and complex query optimization.
  • Hands-on experience designing and delivering large-scale AWS-based data platforms.
  • Demonstrated leadership in guiding teams, influencing architecture decisions, and managing technical delivery.
  • Excellent communication and collaboration skills with the ability to partner across technical and business teams.
  • Bachelor’s degree in Computer Science, Information Systems, Engineering, or equivalent relevant experience/certifications.
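The Data Vault 2.0 modeling called out above keys its Hubs and Links on deterministic hashes of business keys. As a rough illustration of that convention (a minimal sketch with hypothetical policy/claim keys, not Kemper's actual framework or key format):

```python
import hashlib

def hash_key(*business_keys: str) -> str:
    """Derive a deterministic Data Vault hash key: trim and upcase each
    business key, join with a delimiter, then hash. The same inputs always
    yield the same key, so loads are repeatable and parallelizable."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# A Hub row is keyed on a single business key (here, a policy number):
hub_policy_hk = hash_key("POL-000123")

# A Link row is keyed on the combination of the related business keys:
link_policy_claim_hk = hash_key("POL-000123", "CLM-000456")
```

The delimiter matters: without it, the key pairs ("AB", "C") and ("A", "BC") would collide.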

Nice To Haves

  • Experience with advanced Snowflake features:
      - Snowpark (Python/Java/Scala)
      - Snowflake Warehouses, Databases, Schemas, RBAC
      - Snowpipe, Tasks, Streams, Stages, File Formats
      - Performance tuning, warehouse optimization, clustering
      - Data sharing, secure data sharing, reader accounts
      - AI/ML Functions
  • Experience with modern DataOps, DevOps, and automation frameworks:
      - CI/CD (GitHub Actions, GitLab CI, Azure DevOps)
      - Infrastructure-as-Code (Terraform, CloudFormation)
      - Automated testing frameworks for data pipelines
  • Familiarity with data quality and governance tools such as Great Expectations, Monte Carlo, Datafold, Soda, Alation, or Collibra.
  • Experience with streaming technologies such as Kafka, Kinesis, or MSK.
  • Exposure to ML feature engineering pipelines or MLOps practices.
  • P&C insurance industry experience is strongly preferred, specifically auto insurance experience.
  • Prior experience with data warehousing for Guidewire PolicyCenter, ClaimCenter, or BillingCenter and Agency Licensing data is preferred.

Responsibilities

  • Technical Leadership & Architecture: Provide technical leadership to data engineers, setting standards for solution design, coding practices, data governance, and quality.
  • Define and evolve the enterprise data architecture leveraging Snowflake, Data Vault 2.0, and modern event-driven and ELT frameworks.
  • Lead architecture reviews, establish engineering best practices, and guide platform modernization efforts.
  • Mentor engineering team members, facilitate code reviews, and promote continuous learning and innovation.
  • Solution Design & Data Engineering: Architect and oversee the delivery of scalable data pipelines, ingestion frameworks, and transformation processes using Snowflake, Python, Spark, Informatica (PowerCenter/IDMC), AWS Glue, and cloud-native tooling.
  • Design and maintain enterprise data models, including Data Vault 2.0 (Hubs, Links, Satellites, PIT, Bridge structures) and dimensional models.
  • Develop and optimize Snowflake platform capabilities, including:
      - Snowpipe, Streams, Tasks, File Formats, External Stages
      - Dynamic Tables and ELT pipeline automation
      - Warehouse sizing, performance tuning, and cost optimization
  • Implement scalable batch, micro-batch, and real-time ingestion solutions following best practices.
  • Stakeholder Engagement & Delivery: Translate complex business requirements into technical design specifications and actionable development plans.
  • Partner with product owners, business stakeholders, architects, and analytics teams to deliver high-impact data solutions.
  • Manage technical execution across multiple initiatives, ensuring alignment with enterprise priorities, data strategies, and delivery timelines.
  • Oversee documentation, deployment readiness, testing processes, and quality assurance for production releases.
  • Operational Excellence & Production Support: Ensure operational reliability, data accuracy, and performance of enterprise data warehouse and analytical environments.
  • Lead root-cause analysis for production issues and drive implementation of long-term preventative solutions.
  • Implement robust monitoring frameworks using tools such as Snowflake Resource Monitors, CloudWatch, Datadog, or equivalent observability platforms.
  • Conduct performance optimization on Snowflake workloads, SQL queries, pipelines, and integrations.
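The responsibilities above repeatedly reference metadata-driven ELT design, where pipeline steps are declared as data rather than hand-coded jobs. A minimal Python sketch of the pattern, using an in-memory stand-in for the warehouse (the table names, columns, and step structure are illustrative assumptions, not Kemper's platform):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class PipelineStep:
    """Metadata for one ELT step: where to read, where to write, and which
    transform to apply. In a real platform this metadata would live in a
    control table and drive generated Snowflake SQL, not Python lambdas."""
    source: str
    target: str
    transform: Callable[[list[dict]], list[dict]]

def run_pipeline(steps: list[PipelineStep], tables: dict[str, list[dict]]) -> None:
    """Execute every declared step against an in-memory 'warehouse'
    (a dict mapping table name -> list of row dicts)."""
    for step in steps:
        rows = tables.get(step.source, [])
        tables[step.target] = step.transform(rows)

# Example: land raw policy rows, then derive a cleaned staging table.
tables = {"raw_policy": [{"policy_no": " pol-1 ", "premium": "100"}]}
steps = [
    PipelineStep(
        source="raw_policy",
        target="stg_policy",
        transform=lambda rows: [
            {"policy_no": r["policy_no"].strip().upper(),
             "premium": float(r["premium"])}
            for r in rows
        ],
    ),
]
run_pipeline(steps, tables)
```

The point of the pattern is that adding a new feed means adding a row of metadata, not writing a new job, which is what makes the automated ELT frameworks mentioned in the requirements practical at enterprise scale.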

Benefits

  • This job is eligible for an annual discretionary bonus and Kemper benefits (Medical, Dental, Vision, PTO, 401k, etc.)
© 2024 Teal Labs, Inc