Senior Data Engineer

SimplePractice

About The Position

We're looking for a Senior Data Engineer to help lead the evolution of our data stack, from pipelines to platform. In this role, you'll build the infrastructure that powers everything from product intelligence to financial reporting and in-product self-serve analytics. Our customers are clinicians in small to mid-sized private practices, and the data you shape will directly help them run more efficient, effective businesses. By enabling accessible insights, from no-show trends to client engagement, you'll give practitioners the tools to make smarter, faster decisions about how they manage care. Internally, your work will fuel everything from executive reporting and revenue forecasting to product analytics and AI evaluation loops. You'll collaborate across Data Analytics, ML, Product, and Engineering teams to build scalable, reliable systems that make data a first-class product at SimplePractice.

Requirements

  • BS/MS in Engineering, Computer Science, Mathematics, or related field
  • 7+ years in Data or Analytics Engineering
  • Strong problem-solving and communication skills; comfortable in fast-paced, cross-functional environments
  • Experience with enterprise architecture and enterprise data architecture, including data modeling and dimensional modeling
  • Expert in SQL and data modeling (relational, dimensional, semantic)
  • Proven experience in data warehouse design, implementation, and maintenance (Snowflake)
  • Hands-on with dbt for modular, testable transformations
  • Experience with orchestration and ingestion tools: Airflow, Prefect, Airbyte, Fivetran, Kafka
  • Familiar with ELT, schema-on-read, DAGs, and performance optimization (see the DAG sketch after this list)
  • Experience with AWS (S3, RDS, Redshift, etc.)
  • Skilled in handling structured, semi-structured (e.g., JSON), and columnar formats (e.g., Parquet, ORC)
  • Experience building and supporting semantic layers for self-serve analytics
  • Proficient with BI tools like Looker, Tableau, or Sisense
  • Comfortable standardizing metrics and enabling trusted, consistent access to data
  • Proficient in Python and Unix/Linux scripting
  • Comfortable working with APIs (e.g., using curl)
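
For context on the day-to-day work these requirements describe, below is a minimal sketch of an ELT-style Airflow DAG in Python. It assumes Airflow 2.4+, and the DAG id, task ids, and callables (appointments_elt, extract_appointments, load_to_warehouse) are hypothetical placeholders rather than SimplePractice's actual pipeline.

```python
# Minimal ELT orchestration sketch (hypothetical DAG id, task ids, and callables).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_appointments():
    # Placeholder: in practice this step might land raw JSON from an API or an
    # ingestion tool (e.g., Airbyte/Fivetran) into S3.
    pass


def load_to_warehouse():
    # Placeholder: in practice this step might COPY the staged files into Snowflake,
    # leaving modeling to downstream dbt transformations.
    pass


with DAG(
    dag_id="appointments_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ argument; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_appointments", python_callable=extract_appointments)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    # ELT ordering: land raw data first, transform later in the warehouse.
    extract >> load
```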

Nice To Haves

  • AWS DevOps - Terraform, Kubernetes, Docker
  • Project & Change Management skills especially experience working in an Agile (SCRUM, Kanban) environment/team focusing on sprint by sprint deliveries
  • Real-time ETL - Kafka streaming, AWS Kinesis
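
As a rough illustration of the real-time ETL item above, here is a minimal streaming-consumer sketch in Python using kafka-python; the topic name, broker address, and event fields are hypothetical assumptions, and a production pipeline would write to storage rather than print.

```python
# Minimal Kafka streaming-ingestion sketch (hypothetical topic, broker, and fields).
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "appointment_events",                # hypothetical topic name
    bootstrap_servers="localhost:9092",  # placeholder broker address
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # In a real pipeline this would land events in S3/Snowflake or update a
    # near-real-time aggregate instead of printing.
    print(event.get("event_type"), event.get("client_id"))
```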

Responsibilities

  • Partner with Product, Analytics, and Engineering to build scalable systems that help unlock the value of data from a wide range of sources, such as backend databases, event streams, and marketing platforms
  • Lead technical vision and architecture with a holistic point of view across both short-term and long-term horizons
  • Work with Analytics to create company-wide alignment through standardized metrics
  • Work with Product and Engineering teams to support internal use cases such as financial reporting, product analytics and operational metrics
  • Enable external use cases like customer-facing dashboards, self-serve analytics, and next-best-action features in the product
  • Manage the complete data stack from ingestion through data consumption
  • Build tools to increase transparency in reporting company-wide business outcomes
  • Work with DevOps to deploy and maintain data solutions leveraging cloud data technologies, preferably in AWS
  • Help define data quality and data security frameworks to measure and monitor data quality across the enterprise (see the sketch after this list)
  • Define and promote data engineering best practices
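
To make the data quality responsibility concrete, here is a minimal sketch of the kind of check such a framework might run. The raw_appointments table, client_id column, and threshold are hypothetical, and the demo uses an in-memory SQLite connection as a stand-in for a warehouse (e.g., Snowflake) connection.

```python
# Minimal data-quality check sketch (hypothetical table, column, and threshold).
from dataclasses import dataclass


@dataclass
class CheckResult:
    name: str
    passed: bool
    observed: float


def check_null_rate(conn, table: str, column: str, max_null_rate: float = 0.01) -> CheckResult:
    # Fail the check if the share of NULLs in `column` exceeds `max_null_rate`.
    # `conn` can be any DB-API 2.0 connection; identifiers are interpolated
    # directly only because this is a sketch.
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) - COUNT({column}), COUNT(*) FROM {table}")
    null_count, total = cur.fetchone()
    rate = (null_count / total) if total else 0.0
    return CheckResult(name=f"{table}.{column} null rate", passed=rate <= max_null_rate, observed=rate)


if __name__ == "__main__":
    # Self-contained demo: SQLite stands in for the warehouse; in practice the
    # check would run inside an orchestrated task that alerts on failure.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_appointments (client_id INTEGER)")
    conn.executemany("INSERT INTO raw_appointments VALUES (?)", [(1,), (2,), (None,)])
    print(check_null_rate(conn, "raw_appointments", "client_id", max_null_rate=0.5))
```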

Benefits

  • Medical, dental, vision, life & disability insurance
  • 401(k) plan with company match
  • Flexible Time Off (FTO), wellbeing days, paid holidays, and summer Fridays
  • Mental health resources
  • Paid parental leave & Backup Care
  • Tuition reimbursement
  • Employee Resource Groups (ERGs)