Allstate • Posted 3 days ago
Full-time • Mid Level

At Allstate, great things happen when our people work together to protect families and their belongings from life’s uncertainties. For more than 90 years, our innovative drive has kept us a step ahead of our customers’ evolving needs, from advocating for seat belts, air bags, and graduated driving laws to leading the industry in pricing sophistication, telematics, and, more recently, device and identity protection.

Job Description

The Property Claims Big Data Engineer Senior Consultant II will play a key technical role in advancing Property Claims’ cloud-first data modernization journey across Microsoft Fabric, Microsoft Azure, and high-performance on-premises compute environments. This role is responsible for engineering robust, scalable, and secure data solutions that power real-time decisioning, operational reporting, advanced analytics, and vendor integrations across the Property line of business. This includes:

  • Designing, deploying, and maintaining Azure cloud resources using Infrastructure-as-Code (Terraform/Env0)
  • Engineering modern data ingestion, ETL/ELT pipelines, and event-streaming solutions across ADLS Gen2, Event Hub, Azure SQL MI, Snowflake Reader accounts, and OneLake/Fabric
  • Supporting and enhancing high-performance on-premises Linux ETL pipelines that integrate with Oracle, SQL Server, Snowflake, AWS S3, ADLS, OneLake, and vendor APIs
  • Developing code and automation that enable descriptive, predictive, and prescriptive analytics, as well as real-time Property Claims operational insights

The ideal candidate brings strong engineering expertise, or a proven ability to ramp up quickly, in both cloud and on-premises ecosystems; demonstrates deep curiosity and problem-solving capability; and thrives in a fast-moving, hybrid technical organization.

Cloud Engineering (Azure + Microsoft Fabric)
  • Design, deploy, and maintain Azure cloud resources, including ADLS Gen2 containers, Event Hub namespaces/instances, Function Apps, networking components, and Azure SQL Managed Instances.
  • Build and maintain infrastructure-as-code deployments using Terraform and Env0 with GitHub-based CI/CD.
  • Develop and support Fabric-native integrations, including Lakehouse ingestion, shortcuts, pipelines, and real-time event streaming.
  • Engineer secure cloud-to-cloud vendor integrations (e.g., Verisk, Cotality, ClaimXperience, XactAnalysis) using OAuth, event streaming, and managed file transfer pipelines.
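Vendor integrations like those above typically follow an OAuth 2.0 client-credentials flow. Below is a minimal, illustrative Python sketch of that pattern; the token URL, API endpoint, and environment-variable names are placeholders, not actual Verisk, Cotality, ClaimXperience, or XactAnalysis endpoints.

    import os
    import requests

    # Hypothetical endpoints -- real vendor URLs differ per integration.
    TOKEN_URL = "https://vendor.example.com/oauth2/token"
    API_URL = "https://vendor.example.com/api/v1/claims"

    def get_access_token() -> str:
        """Fetch a bearer token via the OAuth 2.0 client-credentials grant."""
        resp = requests.post(
            TOKEN_URL,
            data={
                "grant_type": "client_credentials",
                "client_id": os.environ["VENDOR_CLIENT_ID"],
                "client_secret": os.environ["VENDOR_CLIENT_SECRET"],
            },
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["access_token"]

    def fetch_claims(token: str) -> list:
        """Call the vendor API with the bearer token."""
        resp = requests.get(
            API_URL,
            headers={"Authorization": f"Bearer {token}"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        print(f"Retrieved {len(fetch_claims(get_access_token()))} claim records")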

On-Premises Engineering (High-Performance Linux Environment)
  • Support and enhance a high-performance on-premises Linux server used for ETL, automation, and real-time data extraction and processing.
  • Develop and maintain Python ETL pipelines, orchestrated through Airflow, that interact with Oracle, SQL Server, Snowflake, AWS S3, ADLS, OneLake, and SFTP/PGP sources (a minimal sketch follows this list).
  • Build reusable Python libraries and automation patterns for:
      - API integrations
      - File ingestion and transformation
      - Scheduled data flows and cross-system orchestration
  • Ensure reliability, monitoring, and performance tuning of on-premises data services.
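As a rough illustration of the Airflow-orchestrated pattern described above, the sketch below defines a two-task DAG. The DAG id, schedule, and task bodies are hypothetical; a real pipeline would implement the Oracle/SQL Server extract and the ADLS/OneLake load.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_from_oracle(**context):
        """Placeholder extract step -- a real task would query Oracle or SQL Server."""

    def load_to_adls(**context):
        """Placeholder load step -- a real task would write the extract to ADLS/OneLake."""

    with DAG(
        dag_id="property_claims_daily_etl",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_from_oracle)
        load = PythonOperator(task_id="load", python_callable=load_to_adls)
        extract >> load  # run the load only after the extract succeeds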

Data Engineering & Analytics Enablement
  • Engineer robust Big Data solutions to support descriptive analytics, automated reporting, predictive modeling, and real-time decision analytics for Property Claims.
  • Identify and integrate new data sources, patterns, and technologies that can improve operational insights and analytic capabilities.
  • Partner with internal peers, Allstate Technology, and vendor engineering teams to deliver end-to-end solutions across systems.
  • Drive execution of complex technical workstreams and contribute specialized engineering expertise to mission-critical projects.

Qualifications
  • Strong coding skills in Python and SQL (any dialect; experience with multiple databases preferred)
  • Demonstrated experience with, or the ability to quickly upskill in:
      - Azure services (Event Hub, ADLS, Azure SQL MI, Function Apps, networking)
      - Infrastructure as Code (Terraform) and CI/CD workflows via GitHub/Env0
      - Microsoft Fabric (Lakehouse, Pipelines, Real-Time Analytics, OneLake shortcuts)
  • Experience building ETL/ELT pipelines across multiple data platforms
  • Experience working with APIs, file-based ingestion, or vendor integrations
  • Ability to operate effectively in a hybrid cloud/on-prem environment
  • Strong analytical, problem-solving, and communication abilities
  • 3+ years of experience in data engineering, cloud engineering, ETL architecture, or similar fields
  • Experience with any of the following:
      - Airflow
      - Snowflake
      - Oracle
      - Azure Event Hub or Kafka
      - PGP/SFTP automation
      - Real-time event streaming pipelines
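As one concrete example of the PGP/SFTP automation named above, the sketch below pulls an encrypted file over SFTP and decrypts it locally. It assumes the paramiko and python-gnupg packages, with the decryption key already imported into the local keyring; all hostnames and paths are placeholders.

    import gnupg     # python-gnupg
    import paramiko

    def pull_and_decrypt(host, user, key_path, remote_path, local_path):
        """Download a PGP-encrypted file over SFTP, then decrypt it locally."""
        ssh = paramiko.SSHClient()
        ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # pin host keys in production
        ssh.connect(host, username=user, key_filename=key_path)
        try:
            sftp = ssh.open_sftp()
            sftp.get(remote_path, local_path + ".pgp")
            sftp.close()
        finally:
            ssh.close()

        gpg = gnupg.GPG()  # assumes the private key is in the default keyring
        with open(local_path + ".pgp", "rb") as fh:
            result = gpg.decrypt_file(fh, output=local_path)
        if not result.ok:
            raise RuntimeError(f"PGP decryption failed: {result.status}")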