HCA Healthcare · Posted 3 days ago
Mid Level
Onsite • Nashville, TN
5,001-10,000 employees

Position: Senior Big Data Engineer (Multiple Positions)
Employer: HCA Management Services LP
Worksite: 2555 Park Plaza, Nashville, TN 37203

DUTIES:

  • building and supporting a GCP-based ecosystem designed for enterprise-wide analysis of structured, semi-structured, and unstructured data
  • support regular requests to move data from one cluster to another
  • manage production support teams to make sure service levels are maintained and any interruption is resolved in a timely fashion
  • analyze requirements, design AI/ML-based solutions, and integrate those solutions into customer environments
  • closely collaborate with team members to successfully execute development initiatives using Agile practices and principles
  • lead efforts to design, develop, deploy, and support software systems
  • collaborate with business analysts, project leads, management, and customers on requirements
  • participate in the deployment, change, configuration, management, administration, and maintenance of deployment processes and systems
  • effectively prioritize workload to meet deadlines and work objectives
  • gather requirements, design, construct and deliver solutions with minimal team interaction
  • work in an environment with rapidly changing business requirements and priorities
  • bring new data sources into GCP/HDFS, then transform and load them into databases (an illustrative sketch follows this list)
  • work collaboratively with Data Scientists and business and IT leaders throughout the company to understand data needs and use cases
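For illustration, a minimal Python sketch of the kind of load described above: pulling Parquet files from a Cloud Storage bucket into a BigQuery table with the google-cloud-bigquery client. The project, bucket, dataset, and table names are hypothetical placeholders, not part of the posting.

    # Illustrative sketch: load Parquet files from Cloud Storage into BigQuery.
    # The project, bucket, dataset, and table names are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials

    source_uri = "gs://example-landing-bucket/new_source/*.parquet"
    table_id = "example-project.analytics.new_source_raw"

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    # Start the load job and block until it completes.
    load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
    load_job.result()  # raises if the job failed

    table = client.get_table(table_id)
    print(f"Loaded {table.num_rows} rows into {table_id}")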

REQUIREMENTS:
  • Bachelor's degree (U.S. or foreign equivalent) in Computer Science or a related field
  • five (5) years of IT experience
  • three (3) years of Data Engineer experience
  • one (1) year of experience with the GCP platform
  • one (1) year of experience with GCP Services and Hadoop application design and implementation
  • one (1) year of experience with GKE, BQ, Dataflow, PubSub, Streaming, Java, Python, Scala, SQL, JSON, Avro, Parquet, and Kafka (a streaming sketch follows this list)
  • one (1) year of experience with NoSQL or RDBMS databases
  • one (1) year of experience with CI/CD, Git, and deployment processes
  • one (1) year of experience deploying Big Data technologies to production
  • one (1) year of experience with agile application development, file systems management, and DevOps discipline
  • one (1) year of experience with short-cycle iterations
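For illustration, a minimal Python sketch of the streaming pattern named above: consuming JSON messages from a Pub/Sub subscription and writing them to BigQuery via streaming inserts. The project, subscription, and table names are hypothetical placeholders, not part of the posting.

    # Illustrative sketch: consume JSON messages from Pub/Sub and stream them
    # into BigQuery. Project, subscription, and table names are hypothetical.
    import json

    from google.cloud import bigquery, pubsub_v1

    bq_client = bigquery.Client()
    subscriber = pubsub_v1.SubscriberClient()

    subscription_path = subscriber.subscription_path("example-project", "events-sub")
    table_id = "example-project.analytics.events_raw"

    def handle_message(message):
        row = json.loads(message.data.decode("utf-8"))
        errors = bq_client.insert_rows_json(table_id, [row])  # streaming insert
        if errors:
            message.nack()  # let Pub/Sub redeliver on failure
        else:
            message.ack()

    # Pull messages on a background thread; block the main thread until interrupted.
    streaming_pull = subscriber.subscribe(subscription_path, callback=handle_message)
    try:
        streaming_pull.result()
    except KeyboardInterrupt:
        streaming_pull.cancel()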