Data Engineer - GCP API Developer

Capgemini - Nashville, TN

About The Position

We are seeking a GCP FHIR API Staff Data Engineer to design, develop, maintain, and support custom-built HL7 FHIR APIs on Google Cloud Platform. The role focuses on reading large-scale clinical data from Google Cloud Bigtable, transforming it into FHIR-compliant resources, and exposing it through high-performance, secure REST APIs for downstream consumers. You will work closely with data teams, frequently in a matrixed environment as part of a broader project team.

This is a hands-on engineering role requiring deep experience in custom API development, healthcare data standards, and GCP-native services, with responsibility across the full API lifecycle, from design through production support. It calls for self-starters who are proficient in API problem solving and capable of bringing clarity to complex situations. Because GCP technology and practice evolve rapidly, you must stay well informed of technological advancements and be proficient at putting new innovations into effective practice. As a Staff Data Engineer, you will collaborate closely with all team members to create a modular, scalable solution that addresses current needs and serves as a foundation for future success.
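To make the core data flow concrete, here is a minimal, illustrative sketch of the transformation step described above: mapping one Bigtable row into a FHIR R4 Patient resource. The column family and qualifier names (`demographics:*`) are hypothetical, and the row is simulated as a plain dict; a real implementation would read rows with the `google-cloud-bigtable` client.

```python
# Hypothetical sketch: transform one (simulated) Bigtable row into a
# FHIR R4 Patient resource. Column names are illustrative assumptions.

def row_to_fhir_patient(row_key: str, cells: dict) -> dict:
    """Map a flat Bigtable cell map onto a FHIR Patient resource."""
    return {
        "resourceType": "Patient",
        "id": row_key,
        "name": [{
            "family": cells.get("demographics:last_name"),
            "given": [cells.get("demographics:first_name")],
        }],
        "birthDate": cells.get("demographics:birth_date"),
        "gender": cells.get("demographics:gender"),
    }

# Simulated row, shaped like the cell map a Bigtable read might return
row = {
    "demographics:first_name": "Ada",
    "demographics:last_name": "Lovelace",
    "demographics:birth_date": "1815-12-10",
    "demographics:gender": "female",
}
patient = row_to_fhir_patient("patient-0001", row)
```

In practice the mapping layer would cover many resource types (Patient, Observation, Encounter, and so on) and validate output against the relevant FHIR profiles before serving it.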

Requirements

  • 5+ years of hands-on experience in API development or data engineering
  • Strong experience building custom HL7 FHIR APIs
  • Proven experience working with Google Cloud Bigtable, Cloud Run, GKE, Cloud Functions
  • Strong understanding of FHIR resources and search semantics
  • Proficiency in Java, Python, or Scala
  • Experience with RESTful API design, pagination, filtering, and versioning
  • Experience with OAuth 2.0, API gateways, and IAM
  • Strong debugging and production support experience
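As a small illustration of the FHIR search and pagination semantics listed above, the sketch below wraps a page of results in a `searchset` Bundle whose `next` link carries a continuation token. The base URL and `_page_token` parameter are illustrative assumptions, not part of any real endpoint.

```python
# Illustrative sketch of FHIR search-result pagination: a "searchset"
# Bundle with self/next links. URL and token scheme are hypothetical.

def make_search_bundle(resources, total, next_token=None,
                       base="https://example.org/fhir/Patient"):
    """Wrap one page of resources in a FHIR searchset Bundle."""
    bundle = {
        "resourceType": "Bundle",
        "type": "searchset",
        "total": total,
        "entry": [{"resource": r} for r in resources],
        "link": [{"relation": "self", "url": base}],
    }
    if next_token:
        # A "next" link tells the client how to fetch the following page.
        bundle["link"].append(
            {"relation": "next", "url": f"{base}?_page_token={next_token}"})
    return bundle

page = make_search_bundle(
    [{"resourceType": "Patient", "id": "p1"}], total=42, next_token="abc123")
```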

Nice To Haves

  • Experience in healthcare, payer, or provider environments
  • Experience with Docker and Kubernetes (GKE)
  • Experience defining custom FHIR profiles and extensions
  • Prior experience supporting high-volume, consumer-facing APIs
  • Hands-on experience with the GCP platform and with many of the following components: Spark Streaming, Kafka, Pub/Sub, BigQuery, Dataflow, Cloud Composer, Dataproc, GitHub, CI/CD, Cloud Logging; RDBMS (MS SQL Server, Oracle, Teradata); NoSQL (HBase, Cassandra, MongoDB); in-memory, columnar, and other emerging technologies
  • Google Cloud Professional Data Engineer certification

Responsibilities

  • Adhere to and support data engineering API best practices, processes, and standards
  • Produce high-quality, modular, reusable code that incorporates best practices and serves as an example for less experienced engineers
  • Build productive, healthy relationships within the department and across teams to foster the growth of our culture, our people, and our platforms
  • Work effectively in an environment with rapidly changing business requirements and priorities
  • Share knowledge and experience to grow overall team capabilities
  • Actively participate in technical group discussions and adopt modern technologies to improve development and operations
  • Operate as a self-directed, hands-on engineer
  • Work comfortably in regulated healthcare environments
  • Communicate clearly with both technical and business stakeholders

Benefits

  • Paid time off based on employee grade (A-F), as defined by policy: 12-25 vacation days depending on grade, company-paid holidays, personal days, and sick leave
  • Medical, dental, and vision coverage (or provincial healthcare coordination in Canada)
  • Retirement savings plans (e.g., 401(k) in the U.S., RRSP in Canada)
  • Life and disability insurance
  • Employee assistance programs
  • Other benefits as provided by local policy and eligibility