Zinnia · Posted 3 months ago
$130,000 - $150,000/Yr
Full-time • Senior
Bridgewater, MA
1,001-5,000 employees
Securities, Commodity Contracts, and Other Financial Investments and Related Activities

Zinnia is the leading technology platform for accelerating life and annuities growth. With innovative enterprise solutions and data insights, Zinnia simplifies the experience of buying, selling, and administering insurance products, enabling more people to protect their financial futures. Our success is driven by a commitment to three core values: be bold, team up, deliver value - and that we do. Zinnia has over $180 billion in assets under administration and serves 100+ carrier clients, 2,500 distributors and partners, and over 2 million policyholders.

Responsibilities:
  • Build the next-gen data infrastructure for Zinnia using Lakehouse frameworks, Apache Airflow, Apache Spark, and Hive.
  • Design, build, and optimize data workflows across real-time, nearline, and offline data ecosystems.
  • Leverage Lakehouse platforms (Delta Lake, Hudi, Iceberg) to enable unified batch and streaming pipelines.
  • Collaborate with stakeholders and cross-functional teams to understand business requirements and translate them into scalable, data-driven technical solutions.
  • Provide technical expertise in troubleshooting and resolving complex, distributed data-related issues.
  • Stay up to date with Big Data, cloud, and Lakehouse trends, recommending best practices for data engineering and integration.
  • Mentor and guide junior engineers, fostering a culture of innovation, automation, and continuous learning.
Qualifications:
  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • 10+ years of experience in Big Data engineering or a similar role, with proven leadership and project management experience.
  • Strong expertise in data integration, transformation, and orchestration using Spark, Hive, and Airflow.
  • Proficiency in Lakehouse platforms (Delta Lake, Apache Hudi, Apache Iceberg) and data warehousing concepts.
  • Familiarity with cloud-based data environments (AWS, GCP, or Azure).
  • In-depth understanding of scalable data pipelines, distributed computing, and modern data architectures.
  • Programming knowledge in Scala or Python.
  • Knowledge of data quality and governance principles and experience implementing them within the Big Data lifecycle.
  • Excellent communication, leadership, and interpersonal skills, with the ability to collaborate across teams.
  • Proven ability to adapt to changing priorities, manage multiple projects simultaneously, and deliver results in a fast-paced environment.
Benefits:
  • Health/dental insurance
  • Parental leave
  • 401(k)
  • Incentive/bonus opportunity
  • Tuition reimbursement