About The Position

The Senior Data Engineer designs and develops modern, data-centric applications that support clinical and operational workflows across the healthcare system. These solutions leverage cloud technologies, big data platforms, data science methodologies, and contemporary software development frameworks. Key responsibilities include building scalable data pipelines, transformation and enrichment processes, provisioning layers, and intuitive user interfaces aligned with strategic initiatives. The role also involves mentoring junior Data Engineers and contributing to the research and adoption of emerging technologies to establish repeatable processes and templates. Collaboration is a core value, with a strong emphasis on thorough source control and documentation. The ideal candidate approaches complex challenges with simplicity, using modern tools and techniques to deliver efficient solutions. The position works closely with Product, Platform, and Architecture teams to ensure alignment and successful delivery of joint initiatives.

In this role you will own the reliability, security, and cost-efficiency of Providence's enterprise data platform. You will automate and monitor production pipelines, resolve incidents, and partner with engineering and analytics teams to meet strict SLAs.

Requirements

  • Bachelor's degree in Computer Engineering, Computer Science, Mathematics, or Engineering -OR- a combination of equivalent education and work experience
  • 5 or more years of related experience

Nice To Haves

  • Master's degree in Computer Engineering, Computer Science, Mathematics, or Engineering -OR- a combination of equivalent education and work experience
  • 5 or more years of data warehousing experience
  • Strong SQL experience
  • Hands-on Python and shell scripting (Bash/PowerShell) on Linux
  • Experience with cloud computing and data warehousing on relational platforms
  • Solid understanding of OLAP/dimensional modeling and RDBMS fundamentals
  • Knowledge of semi-structured and unstructured data (JSON/Parquet) and API integrations
  • Familiarity with open-source ELT/ETL tooling
  • Git-based workflows and CI/CD basics for data deployments

Responsibilities

  • Automating and monitoring batch/streaming ELT/ETL pipelines across cloud services and Snowflake, keeping data fresh and accurate (see the freshness-check sketch after this list)
  • Triaging failures, performing root-cause analyses and post-mortems, and continuously reducing MTTR
  • Orchestrating schedules and dependencies (e.g., Control-M) and managing connectors (e.g., Fivetran/HVR)
  • Building operational automation with SQL, Python, and shell (Bash/PowerShell) on Linux
  • Implementing observability: logging, alerting, data-quality checks, audits, and runbooks
  • Optimizing Snowflake performance and spend (warehouse sizing, caching); see the cost-monitoring sketch below
  • Enforcing access controls and secrets management, supporting compliance and audit needs
  • Partnering with product/analytics teams; documenting procedures and providing reliable on-call support (rotational)
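
For a flavor of the day-to-day work, here is a minimal sketch of the kind of table-freshness check described above. It uses the snowflake-connector-python package; the table name ANALYTICS.CLINICAL.ENCOUNTERS, the loaded_at column, the OPS_WH warehouse, the environment-variable names, and the four-hour SLA are all illustrative assumptions, not details from this posting.

```python
import os
from datetime import datetime, timedelta, timezone

import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical table and SLA for illustration; real values would live in the
# pipeline's runbook or orchestration config.
TABLE = "ANALYTICS.CLINICAL.ENCOUNTERS"
MAX_STALENESS = timedelta(hours=4)


def table_is_fresh() -> bool:
    """Return True if the newest load timestamp is within the staleness SLA."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="OPS_WH",  # assumed small warehouse reserved for monitoring
    )
    try:
        with conn.cursor() as cur:
            # Assumes loaded_at is TIMESTAMP_TZ, so the driver returns a
            # timezone-aware datetime comparable against UTC "now".
            cur.execute(f"SELECT MAX(loaded_at) FROM {TABLE}")
            (latest,) = cur.fetchone()
    finally:
        conn.close()
    if latest is None:
        return False  # an empty table counts as stale
    return datetime.now(timezone.utc) - latest <= MAX_STALENESS


if __name__ == "__main__":
    # Exiting non-zero lets a scheduler such as Control-M mark the check
    # failed and trigger the normal alerting path.
    if not table_is_fresh():
        raise SystemExit(f"{TABLE} is stale beyond {MAX_STALENESS}")
```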
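Similarly, a sketch of the Snowflake spend monitoring mentioned above. SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY is Snowflake's built-in credit-usage view (reading it requires an appropriately privileged role); the 50-credit weekly budget is an invented threshold for illustration.

```python
import os

import snowflake.connector  # pip install snowflake-connector-python

# The weekly budget below is an illustrative number, not a real policy.
WEEKLY_CREDIT_BUDGET = 50.0
SPEND_QUERY = """
    SELECT warehouse_name, SUM(credits_used) AS credits_7d
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_7d DESC
"""

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)
try:
    with conn.cursor() as cur:
        # The cursor is iterable after execute(), yielding one row per warehouse.
        for warehouse, credits_7d in cur.execute(SPEND_QUERY):
            flag = "OVER BUDGET" if credits_7d > WEEKLY_CREDIT_BUDGET else "ok"
            print(f"{warehouse:<30} {credits_7d:>10.2f} credits  {flag}")
finally:
    conn.close()
```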

Benefits

  • 401(k) Savings Plan with employer matching
  • Health care benefits (medical, dental, vision)
  • Life insurance
  • Disability insurance
  • Time off benefits (paid parental leave, vacations, holidays, health issues)
  • Voluntary benefits
  • Well-being resources