Analytics Engineer

PURE Insurance · San Diego, CA
$95,000 - $115,000

About The Position

As an Analytics Engineer, you'll be a key player in our data architecture transformation. Your SQL expertise will be the foundation upon which we build our new medallion-style data architecture. You'll be working with cutting-edge tools like Databricks and dbt, turning our data into a powerful, unified ecosystem that drives business decisions.

Requirements

  • Bachelor's degree in Computer Science, Mathematics, Statistics, or a related field
  • Minimum of 2 years of analytics engineering or similar experience, with a strong focus on SQL
  • Proven track record of solving complex data challenges using advanced SQL techniques
  • Ability to communicate technical concepts clearly to both technical and non-technical audiences
  • Working knowledge of data modeling concepts and best practices

Nice To Haves

  • Experience with dbt, demonstrating ability to quickly adapt to modern data tools
  • Proficiency in Python or other programming languages used in data analysis
  • Familiarity with Databricks or similar big data platforms
  • Knowledge of insurance industry data and regulatory requirements
  • Experience with BI tools (e.g., Hex, Tableau, Power BI, or similar), showcasing your end-to-end data skills
  • Understanding of semantic layer implementation and benefits

Responsibilities

  • Contribute to the development of our new medallion-style data architecture in Databricks, translating your analytical expertise into robust data models
  • Build out PURE’s data dictionary by creating high-quality, user-friendly documentation for new and refactored data pipelines
  • Harness the power of dbt to unify and standardize hundreds of regulatory data calls, building out the "platinum" layer of our data warehouse and ensuring data consistency across the organization
  • Collaborate with cross-functional teams to deeply understand business needs, translating them into elegant technical solutions that drive decision-making
  • Implement data models, ETL processes, and data quality checks, ensuring data integrity and reliability
  • Optimize data pipelines for performance and scalability, allowing for faster and more efficient data analysis
  • Work with cutting-edge tools such as Databricks, dbt, Hex, and semantic layer technologies, expanding your technical toolkit
© 2024 Teal Labs, Inc