Surescripts • Posted 3 months ago
$129,150 - $157,850/Yr
Full-time • Senior
Hybrid • Minneapolis, MN
Computing Infrastructure Providers, Data Processing, Web Hosting, and Related Services

Surescripts serves the nation through simpler, trusted health intelligence sharing, in order to increase patient safety, lower costs, and ensure quality care. We deliver insights at critical points of care for better decisions, from streamlining prior authorizations to delivering comprehensive medication histories to facilitating messages between providers.

Job Summary: The Senior Cloud Data Engineer plays a key role in designing, building, and maintaining data pipelines and infrastructure on Google Cloud Platform (GCP) BigQuery. The incumbent will collaborate with data analysts, data scientists, and other engineers to ensure timely access to high-quality data for data-driven decision-making across the organization. The Senior Cloud Data Engineer is a highly technical engineer who has mastered hands-on coding of data processing solutions and scalable data pipelines that support analytics and exploratory analysis. This role ensures new business requirements are decomposed and implemented in cohesive end-to-end designs that enable data integrity and quality and best support the BI and analytic capabilities that power decision-making at Surescripts. This includes building data acquisition programs that handle the business's growing data volume as part of the Data Lake in the GCP BigQuery ecosystem, and maintaining a robust data catalog. This is a senior data engineering role within Data & Analytics' Data Core organization, working closely with Data & Analytics leadership. The incumbent will continually improve the business's data and analytic solutions, processes, and data engineering capabilities, embracing industry best practices and trends and, through acquired knowledge, driving process and system improvement opportunities.

Responsibilities:
  • Design, develop, and implement data pipelines using GCP BigQuery, Dataflow, and Airflow for data ingestion, transformation, and loading.
  • Optimize data pipelines for performance, scalability, and cost-efficiency.
  • Ensure data quality through data cleansing, validation, and monitoring processes.
  • Develop and maintain data models and schemas in BigQuery to support various data analysis needs.
  • Automate data pipeline tasks using scripting languages like Python and tools like Dataflow.
  • Collaborate with data analysts and data scientists to understand data requirements and translate them into technical data solutions.
  • Leverage Terraform (Infrastructure as Code) to integrate data pipelines seamlessly with CI/CD workflows.
  • Monitor and troubleshoot data pipelines and infrastructure to identify and resolve issues.
  • Stay up-to-date with the latest advancements in GCP BigQuery and other related technologies.
  • Document data pipelines and technical processes for future reference and knowledge sharing.
Qualifications:
  • Bachelor's degree or equivalent experience in Computer Science, Mathematics, Information Technology, or a related field.
  • 5+ years of solid experience as a data engineer.
  • Strong understanding of data warehousing / data lake concepts and data modeling principles.
  • Proven experience with designing and implementing data pipelines using GCP BigQuery, Dataflow and Airflow.
  • Strong SQL skills and proficiency in a scripting language such as Python (or similar).
  • Experience with data quality tools and techniques.
  • Ability to work independently and as part of a team.
  • Strong problem-solving and analytical skills.
  • Passion for data and a desire to learn and adapt to new technologies.
  • Experience with other GCP services such as Cloud Storage, Dataflow, and Pub/Sub.
  • Experience with cloud deployment and automation tools like Terraform.
  • Experience with data visualization tools like Tableau or Power BI or Looker.
  • Experience with healthcare data.
  • Familiarity with machine learning, artificial intelligence and data science concepts.
  • Experience with data governance and healthcare PHI data security best practices.
  • Ability to communicate effectively, conveying complex technical concepts as well as task and project updates.
Benefits:
  • Comprehensive healthcare (including infertility coverage)
  • Generous paid time off, including paid childbirth and parental leave and mental health days
  • Pet insurance
  • 401(k) with company match and immediate vesting