Senior Data Platform Engineer

Inspiren
Posted 4 days ago · $180,000 – $200,000 · Remote

About The Position

From our real-time monitoring platform to deeper longitudinal insights, data is at the heart of everything Inspiren does. We are seeking a highly skilled Senior Data Platform Engineer to own, maintain, and develop our data infrastructure, built on Databricks and AWS, to scale and accelerate our data capabilities across the company.

Requirements

  • Bachelor's or Master's degree in Computer Science or a related engineering field.
  • 5+ years of experience in data infrastructure, platform engineering, data engineering, or similar.
  • Cloud Providers: Demonstrated hands-on experience with Databricks. AWS experience is a plus but not required.
  • Data Warehousing: Expertise in modern data warehouse and lakehouse architectures.
  • Data Pipelines: Expertise in modern ETL technologies and building and supporting data pipelines at scale.
  • Access Control and Management: Familiarity with access control, data governance, and security.
  • Reliability: Demonstrated experience with reliability, observability, and monitoring of infrastructure and systems.
  • DevEx: Experience building and maintaining data platforms for a wide variety of stakeholders and users with diverse levels of technical expertise.
  • Scalability: Demonstrated ability to scale cloud-based data workloads while optimizing and monitoring costs.
  • Communication: Excellent verbal and written communication skills, with the ability to convey complex ideas clearly.
  • Adaptability: Comfortable working in a fast-paced, dynamic environment and adapting to changing priorities.

Nice To Haves

  • Start-up experience is a plus.
  • Health-tech experience is a plus.

Responsibilities

  • Collaborate with engineering, data science, ML, data engineering, and product analytics teams to understand and shape the future needs of our data platform and infrastructure.
  • Define, drive, and implement the future live-ingestion layer of our data platform (e.g., Kafka, Kinesis).
  • Define and evolve standards for storage, compute, data management, provenance, and orchestration.
  • Monitor current and forecast future infrastructure needs and costs based on business requirements.
  • Implement and manage role-based access control.
  • Design and implement reusable tooling and frameworks for data consumers (e.g., data quality monitoring frameworks, templated pipelines, foolproof backfilling capabilities).

Benefits

  • Equity
  • Benefits (including medical, dental, and vision)
  • Flexible PTO