Manager

EXL Talent Acquisition Team

About The Position

Responsible for designing, constructing, installing, and maintaining large-scale processing systems and other infrastructure. Ensure that data, whether structured or unstructured, is easily accessible and usable by analysts. Build ETL tools, migrate legacy systems to modern data ecosystems, and handle FHIR resources in healthcare data environments. Design data pipelines, optimize data processing, and deliver actionable insights. Manage GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage to deliver business-critical insights. Possess strong cloud-based data engineering skills and hands-on experience with GCP services. Optimize existing workflows for performance, scalability, and cost efficiency. Design and implement data pipelines using tools such as Apache Beam, Dataflow, or Cloud Composer (Airflow). Develop, optimize, and manage large-scale ETL/ELT workflows and processes on GCP. Utilize BigQuery for data warehousing and analytics, writing complex SQL queries for reporting and analysis. Build and maintain real-time data streaming solutions using Pub/Sub and Dataflow. Implement best practices for data security, governance, and compliance (IAM roles, encryption). Manage and maintain GCP storage systems such as Cloud Storage, ensuring high availability and scalability. Monitor and troubleshoot data pipelines and workflows, ensuring reliability and performance.
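To give a concrete sense of the batch ETL work described above, here is a minimal extract/transform/load sketch in plain Python. It is illustrative only: the record fields (`id`, `visits`) are invented, and a production pipeline at this scale would run on Apache Beam/Dataflow with a real BigQuery load step rather than these stand-in functions.

```python
import json

def extract(raw_lines):
    """Parse newline-delimited JSON records (e.g., a file exported to Cloud Storage)."""
    return [json.loads(line) for line in raw_lines]

def transform(records):
    """Drop invalid records and normalize field names/types for the warehouse schema."""
    return [
        {"patient_id": r["id"], "visits": int(r.get("visits", 0))}
        for r in records
        if "id" in r
    ]

def load(records):
    """Stand-in for a BigQuery load job: here we just report the row count."""
    return len(records)

raw = ['{"id": "p1", "visits": "3"}', '{"visits": 2}', '{"id": "p2"}']
rows = transform(extract(raw))
print(load(rows))
```

The same three stages map directly onto a Beam pipeline's read, `ParDo`/`Map`, and write transforms; keeping them as separate, pure functions makes each stage independently testable.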

Requirements

  • Strong cloud-based data engineering experience.
  • Hands-on experience with GCP services.
  • Experience in designing and implementing data pipelines.
  • Proficiency in SQL for data analysis and reporting.
  • Experience with ETL/ELT processes.
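The SQL proficiency called out above typically means analytic queries with aggregations and window functions. The sketch below runs against in-memory SQLite for portability (the table and values are invented), but the `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` pattern carries over to BigQuery Standard SQL.

```python
import sqlite3

# Toy events table; in practice this would be a BigQuery table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (patient_id TEXT, event_date TEXT, cost REAL);
INSERT INTO events VALUES
  ('p1', '2024-01-01', 100.0),
  ('p1', '2024-01-02', 50.0),
  ('p2', '2024-01-01', 75.0);
""")

# Running cost per patient, ordered by date -- a typical reporting query shape.
query = """
SELECT patient_id,
       event_date,
       SUM(cost) OVER (PARTITION BY patient_id ORDER BY event_date) AS running_cost
FROM events
ORDER BY patient_id, event_date;
"""
rows = conn.execute(query).fetchall()
for r in rows:
    print(r)
```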

Nice To Haves

  • Familiarity with Apache Beam, Dataflow, or Cloud Composer (Airflow).
  • Experience with FHIR resources in healthcare data environments.
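"Handling FHIR resources" usually involves flattening deeply nested resource JSON into analytics-friendly rows. The sketch below parses a pared-down FHIR R4 `Patient` resource with the standard library; real resources carry many more fields, and the flattening rules here (first name entry only) are a simplifying assumption.

```python
import json

# A pared-down FHIR R4 Patient resource (illustrative; real resources are much richer).
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter"]}],
  "birthDate": "1974-12-25"
}
"""

def flatten_patient(resource: dict) -> dict:
    """Flatten the nested FHIR structure into a single warehouse-ready row."""
    name = resource.get("name", [{}])[0]  # take the first name entry only
    return {
        "patient_id": resource["id"],
        "family_name": name.get("family"),
        "given_name": " ".join(name.get("given", [])),
        "birth_date": resource.get("birthDate"),
    }

row = flatten_patient(json.loads(patient_json))
print(row["patient_id"])
```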

Responsibilities

  • Design, construct, install, and maintain large-scale processing systems and infrastructure.
  • Ensure data accessibility and usability for analysts.
  • Build ETL tools and migrate legacy systems to modern data ecosystems.
  • Handle FHIR resources in healthcare data environments.
  • Design data pipelines and optimize data processing.
  • Manage GCP services like BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
  • Optimize existing workflows for performance, scalability, and cost-efficiency.
  • Develop, optimize, and manage large-scale ETL/ELT workflows on GCP.
  • Utilize BigQuery for data warehousing and analytics.
  • Write complex SQL queries for reporting and analysis.
  • Build and maintain real-time data streaming solutions using Pub/Sub and Dataflow.
  • Implement best practices for data security, governance, and compliance.
  • Monitor and troubleshoot data pipelines and workflows.
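The streaming responsibilities above center on windowed aggregation of event streams. The stdlib sketch below mimics the fixed-window counting a Beam/Dataflow job would apply to a Pub/Sub stream; the 60-second window size and the event tuples are invented for illustration.

```python
from collections import defaultdict

WINDOW_SECONDS = 60

def window_counts(events):
    """Assign each (timestamp, key) event to a fixed 60s window and count per window,
    mimicking the fixed-window aggregation of a Beam/Dataflow streaming pipeline."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
        counts[(window_start, key)] += 1
    return dict(counts)

# Toy event stream: (epoch-second timestamp, event type).
events = [(5, "admit"), (42, "admit"), (61, "discharge"), (70, "admit")]
print(window_counts(events))
```

A real Dataflow job adds what this sketch omits: watermarks and triggers to decide when a window's result is emitted despite late-arriving data.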

Benefits

  • Telecommuting permitted.
  • 5% domestic travel to client sites.