Staff Software Engineer, Data Platform Lead (AI / Insurtech)
EvolutionIQ
·
Posted:
August 30, 2023
·
Onsite
About the position
We are seeking a Lead Engineer for our Data Platform who will be responsible for securing, architecting, and managing our sensitive insurance data. This role involves overseeing foundational datasets, data models, and analytics, as well as ensuring the robustness, reliability, and security of our data systems. The ideal candidate will have experience creating and managing secure data platforms, a strong engineering background, and a track record of technical leadership. They will play a crucial role in safeguarding our information assets and shaping the future of our data platform architecture.
Responsibilities
- Architect, design, and implement robust, secure, scalable, and high-quality data platforms, ensuring the availability, integrity, and confidentiality of the information.
- Lead the development and maintenance of data pipelines, including personally coding and building the most critical components.
- Work closely with product engineers, data scientists, analysts, and other stakeholders to understand data needs and deliver on those needs.
- Define, design, and improve foundational data models to be used across the company to enable feature development and analytics.
- Continuously improve our data quality toolkit.
- Provide guidance and technical leadership to the data engineering team, promoting continual team growth and individual team member skill development.
- Be a role model for all engineers and provide mentorship as needed.
- Drive proof of concepts and experiments to explore new technologies that can level up the entire organization.
Requirements
- 7+ years of industry experience in staff/principal/lead-level software engineering or data engineering roles, with a focus on building scalable, mission-critical data platforms
- Strong written and verbal communication skills
- Extensive Python development experience
- Experience with distributed data/computing tools such as Spark, Airflow, or dbt
- Proven track record of establishing engineering best practices for both coding and architecture
- Experience building out systems and processes to enable secure handling of highly sensitive data
- Experience using modern big data storage technologies such as Apache Parquet or Avro
- Strong familiarity with modern data warehouses such as BigQuery or Snowflake
- Ambitious, collaborative, and empathetic values
Even Better if You Have
- 3+ years of experience deploying systems on GCP or AWS
- Experience with MLOps, such as feature engineering and model serving
- Experience with Dagster/Airflow, BigQuery, GCP, Terraform, or Kubernetes
Benefits
- Compensation: The range is $210K–$240K, depending on a candidate’s background and experience.
- Well-Being: Full medical, dental, vision, short- & long-term disability, and 401k matching (100% of employee contributions up to 3%, plus 50% of the next 2%).
- Work/Life Balance: Regular work from the NYC office with flexibility, flexible vacation policy, and winter break closure.
- Home & Family: Flexible PTO, 100% paid parental leave (4 months for primary caregivers, 3 months for secondary caregivers), sick days, a flexible schedule for new parents, and newborn sleep-training support.
- Office Life: Catered lunches, happy hours, pet-friendly office space, $500 for in-home office setup, and $200/year for upgrades.
- Growth & Training: $1,000/year for professional development and upskilling opportunities.
- Sponsorship: Open to sponsoring candidates currently in the U.S. who need to transfer their active H-1B visa.