Senior Analytics Engineer

DPL Financial Partners
Louisville, KY (Hybrid)

About The Position

DPL Financial Partners is a leader in annuities distribution, and we're building a modern analytics platform that powers everything from operational reporting to AI-driven insights. This isn't a role where you'll spend your days dragging and dropping to build pixel-perfect dashboards. You'll own entire business domains, from pipeline infrastructure through data modeling to the intelligence layer that business teams rely on daily. We're looking for someone who can go deep on modern data stack technology (Snowflake, dbt, semantic layers) while also understanding the business problems that data is meant to solve. You'll work directly with the CDO and collaborate closely with a small, high-impact analytics team where everyone contributes to shared infrastructure while owning their domain end-to-end. This role has a path to leadership for the right person.

Requirements

  • 5+ years of experience in analytics engineering, data engineering, or a related field
  • Expert SQL skills: you can write complex queries, optimize performance, and debug data issues without hand-holding
  • Strong experience with dbt: you understand the modeling patterns, testing framework, and deployment workflows (a brief illustrative sketch follows this list)
  • Experience with Snowflake or a comparable cloud data warehouse (BigQuery, Redshift, Databricks)
  • Proficiency with Python for data processing, automation, or analysis (vibe coding is acceptable)
  • Working knowledge of Git and CI/CD practices for analytics code
  • Demonstrated experience using AI coding assistants to accelerate development-or genuine curiosity and willingness to adopt them
  • Ability to work onsite in Louisville, KY 3-5 days per week (relocation assistance available)
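
For a concrete (and purely illustrative) sense of the dbt modeling work mentioned above, here is a minimal staging-model sketch. The source and column names (salesforce.opportunity, opportunity_amount, and so on) are assumptions for illustration, not DPL's actual schema.

    -- models/staging/stg_salesforce__opportunities.sql
    -- Hypothetical staging model; source and column names are assumptions.
    with source as (
        select * from {{ source('salesforce', 'opportunity') }}
    ),

    renamed as (
        select
            id as opportunity_id,
            account_id,
            stage_name,
            amount as opportunity_amount,
            close_date,
            is_closed
        from source
    )

    select * from renamed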

Nice To Haves

  • Experience in insurance, financial services, or annuities distribution
  • Experience with semantic layer tools (dbt Semantic Layer/MetricFlow, Cube, AtScale, Snowflake Semantic Views)
  • Familiarity with Salesforce data models and integration patterns
  • Experience with modern BI tools like Sigma Computing, Hex, or Lightdash
  • Background building AI/ML-powered data products or preparing data for LLM applications, such as Snowflake Intelligence or Streamlit
  • Pipeline orchestration experience (Airflow, Dagster, Prefect, Matillion, Fivetran, OpenFlow)
  • Experience at an enterprise software company or SaaS platform: you understand how data teams support product and go-to-market at scale

Responsibilities

  • Build and maintain dbt models that transform raw data from Salesforce, NetSuite, DTCC, and internal systems into business-ready datasets
  • Design and implement semantic layer definitions (metrics, dimensions, entities) that power both BI tools and AI/LLM interfaces
  • Own data quality for your domain: implement tests, monitoring, and alerting that catch issues before stakeholders do (see the test sketch after this list)
  • Partner directly with business stakeholders to understand their problems and translate them into analytics solutions
  • Use AI coding assistants (GitHub Copilot, Claude, dbt Copilot) throughout the development lifecycle: we expect you to leverage these tools to move faster and maintain quality
  • Contribute to shared infrastructure: pipeline orchestration, CI/CD, observability, governance
  • Develop data products that solve real business problems: this could include AI-powered document processing, self-service analytics tools, or embedded partner BI
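
To make the data quality bullet concrete, the sketch below shows a singular dbt test: a select that should return zero rows, which dbt fails when rows come back. Model and column names are hypothetical, carried over from the staging sketch above.

    -- tests/assert_opportunities_have_accounts.sql
    -- Singular dbt test (hypothetical names): returns any opportunity
    -- whose account_id has no match in the accounts staging model.
    -- dbt marks the test as failing if this query returns any rows.
    select o.opportunity_id
    from {{ ref('stg_salesforce__opportunities') }} as o
    left join {{ ref('stg_salesforce__accounts') }} as a
        on o.account_id = a.account_id
    where a.account_id is null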

Benefits

  • Competitive salary commensurate with experience
  • Comprehensive health, dental, and vision insurance
  • 401(k) with company match
  • Relocation assistance available
  • Professional development budget


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: None listed
  • Number of Employees: 51-100
