Data Architect Senior - Clinical

Intermountain Health | Lake Park, IA
Hybrid

About The Position

Provide leadership in developing data marts, designing data models, implementing ETL processes, and ensuring data usability for clients.
This is a hybrid position requiring in-person presence once per month. Position may telecommute from any state in the United States except California, Connecticut, Hawaii, Illinois, New York, Rhode Island, Vermont, and Washington.

Requirements

  • Master’s degree or equivalent foreign education
  • Academic Discipline(s): Computer Science, Information Systems, Bioinformatics, Statistics, or a closely related field
  • 0 years of experience
  • Knowledge of Oracle or Microsoft SQL Server databases.
  • Knowledge of Alteryx, Cognos, and Tableau.
  • Knowledge of Architecture and Modeling Tools.
  • Knowledge of Data Integration and Strategies.
  • Knowledge of SQL.
  • Knowledge of R and Python.
  • Knowledge of Data Warehouse Concepts, Models, Cohorts, Metrics and KPIs.
  • Knowledge of Informatica Intelligent Cloud Services, Informatica PowerCenter, and BMC Control-M.
  • Knowledge of Azure Databricks.
  • Knowledge of Epic Clarity and Caboodle.
  • Knowledge of Epic Cogito.
  • Knowledge of Epic Caboodle Development.
  • Knowledge of the Epic Clinical Data Model.
  • Upon hire, all applicants will be subject to drug testing/screening and background checks.

Responsibilities

  • Provide leadership in developing data marts, designing data models, implementing ETL processes, and ensuring data usability for clients.
  • Lead the physical design and implementation of databases for complex projects, optimizing data presentation for reporting tools.
  • Develop and maintain standards for data warehouse elements, such as architectures, models, tools, and databases, ensuring they align with best practices.
  • Provide troubleshooting support for data warehouses and address technical issues promptly.
  • Design, implement, and operate data warehouse systems to balance optimization of data access and resource utilization to meet performance requirements.
  • Ensure successful delivery of data models, pipelines, and warehouses, overseeing end-to-end delivery of data solutions across cloud and on-premises environments.
  • Perform complex system and data analysis, using SQL, PL/SQL, Python, and other languages to develop solutions through advanced programming and analysis.
  • Build robust processes to streamline data flow by developing data warehouse process models, including sourcing, loading, transformation, and extraction, to support data integration pipelines.
  • Create and manage supporting documentation, including metadata, diagrams of entity relationships, business processes, and process flows, for governance and reference.
  • Ensure proper integration of data with existing standards and business processes to support smooth operations, maintaining consistency across systems.
  • Create plans, test files, and scripts for data warehouse testing, including unit and integration testing to ensure system readiness.
  • Develop, document, and deliver complex ad-hoc queries, consulting with business users to determine needs, feasibility, and priorities.
  • Develop and implement ETL processes and map data between source systems, data warehouses, and data marts, ensuring smooth data flow across them, while verifying the structure, accuracy, and quality of warehouse data.
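To give a concrete flavor of the ETL and data-quality work described above, here is a minimal, purely illustrative sketch in Python of an extract-transform-load step with a structural validation check before loading. All field and table names are hypothetical and do not reflect Intermountain Health's actual systems or schemas.

```python
# Illustrative sketch only: a tiny extract-transform-load (ETL) step with a
# structure/accuracy check, mirroring the kind of mapping between a source
# system and a data mart described in the responsibilities above.
# All column and field names here are hypothetical.

from datetime import date


def extract(source_rows):
    """Pull raw rows from a (hypothetical) source-system export."""
    return list(source_rows)


def transform(rows):
    """Map source fields to the warehouse schema and normalize types."""
    out = []
    for r in rows:
        out.append({
            "patient_id": int(r["PAT_ID"]),
            "visit_date": date.fromisoformat(r["VISIT_DT"]),
            "department": r["DEPT_NAME"].strip().upper(),
        })
    return out


def validate(rows):
    """Verify structure and basic accuracy before loading."""
    required = {"patient_id", "visit_date", "department"}
    for r in rows:
        assert set(r) == required, f"unexpected columns: {set(r)}"
        assert r["patient_id"] > 0, "patient_id must be positive"
    return rows


def load(rows, target):
    """Append validated rows to the (hypothetical) data mart table."""
    target.extend(rows)
    return len(rows)


source = [
    {"PAT_ID": "101", "VISIT_DT": "2024-03-01", "DEPT_NAME": " cardiology "},
    {"PAT_ID": "102", "VISIT_DT": "2024-03-02", "DEPT_NAME": "oncology"},
]
mart = []
loaded = load(validate(transform(extract(source))), mart)
```

In practice this work would be done with the tools named in the requirements (Informatica, SQL, Databricks) rather than hand-rolled Python, but the extract, transform, validate, and load stages are the same.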

Benefits

  • We care about your well-being – mind, body, and spirit – which is why we provide our caregivers a generous benefits package that covers a wide range of programs to foster a sustainable culture of wellness that encompasses living healthy, happy, secure, connected, and engaged.
  • Intermountain Health’s PEAK program supports caregivers in the pursuit of their education goals and career aspirations by providing up-front tuition coverage paid directly to the academic institution. The program offers 100+ learning options to choose from, including undergraduate studies, high school diplomas, and professional skills and certificates. Caregivers are eligible to participate in PEAK on day 1 of employment.