Highmark Health · posted 2 months ago
$67,500 - $126,000/Yr
Full-time
5,001-10,000 employees

This is a fully remote position. We are seeking an integral member of our technical team who will support the design, development, and maintenance of the organization's data and programming infrastructure, ensuring the efficient and reliable flow of data across various systems. This role requires a strong understanding of data and ETL architecture, cloud-based data solutions (ideally GCP and Databricks), and programming languages such as Python and PySpark, with ETL (extract, transform, load) experience. The ideal candidate will have a proven track record of designing, programming, and implementing advanced data engineering solutions, hands-on experience with the entire software development life cycle (SDLC), heavy coding experience in relevant languages (Python, PySpark), and deep knowledge of SQL; working knowledge of tools like Informatica is a plus.
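
To make the day-to-day work concrete, the following is a minimal sketch of the kind of PySpark ETL job described above; the bucket paths, column names, and claims dataset are hypothetical illustrations, not details from Highmark Health.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("claims_etl_example").getOrCreate()

    # Extract: read raw records from a landing zone (path is an assumption).
    raw = spark.read.parquet("gs://example-bucket/landing/claims/")

    # Transform: deduplicate, drop invalid amounts, and stamp a load date.
    cleaned = (
        raw.dropDuplicates(["claim_id"])
           .filter(F.col("claim_amount") > 0)
           .withColumn("load_date", F.current_date())
    )

    # Load: write to a curated zone, partitioned for downstream queries.
    (cleaned.write
            .mode("overwrite")
            .partitionBy("load_date")
            .parquet("gs://example-bucket/curated/claims/"))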

  • Design, develop, and maintain robust data processes and solutions to ensure the efficient movement and transformation of data across multiple systems.
  • Develop and maintain data models, databases, and data warehouses to support business intelligence and analytics needs.
  • Collaborate with stakeholders across IT, product, analytics, and business teams to gather requirements and provide data solutions that meet organizational needs.
  • Monitor work against the production schedule, provide progress updates, and report any issues or technical difficulties to lead developers regularly.
  • Implement and manage data governance practices, ensuring data quality, integrity, and compliance with relevant regulations.
  • Collaborate on the design and implementation of data security measures, including access controls, encryption, and data masking.
  • Perform data analysis and provide insights to support decision-making across various departments.
  • Stay current with industry trends and emerging technologies in data engineering, recommending new tools and best practices as needed.
  • Other duties as assigned or requested.
  • 3 years of experience in the design and analysis of algorithms, data structures, and design patterns used to build and deploy scalable, highly available systems.
  • 3 years of experience in a data engineering, ETL development, or data management role.
  • 3 years of experience in SQL and experience with database technologies (e.g., MySQL, PostgreSQL, MongoDB).
  • 3 years of experience in data warehousing concepts and experience with data warehouse solutions (e.g., Snowflake, Redshift, BigQuery).
  • 3 years of experience with data pipeline and workflow management tools (e.g., Apache, GCP Tools, Databricks, PySpark).
  • 3 years of experience building ETL and data integration pipelines in Python and PySpark; knowledge of Informatica is a plus.
  • 3 years of experience working with on-premises databases such as Oracle, Teradata, and DB2.
  • 3 years of experience with Cloud platforms (GCP and Azure) and their respective data services.
  • 3 years of deep SQL and Unix experience.
  • 1 year of experience working with a variety of technology systems and designing or developing data solutions in healthcare.
  • 3 years of experience translating requirements, design mockups, prototypes or user stories into technical designs.
  • 3 years of experience writing data-related code that is fault-tolerant, efficient, and maintainable (see the sketch following this list).
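
As an illustration of the fault-tolerant, maintainable data code the last requirement describes, here is a minimal sketch; the retry policy, empty-output guard, logger name, and target path are assumptions for illustration, not Highmark Health standards.

    import logging
    import time

    from pyspark.sql import DataFrame

    logger = logging.getLogger("etl_quality")

    def load_with_retry(df: DataFrame, target_path: str, attempts: int = 3) -> None:
        # Refuse to overwrite a curated dataset with empty output.
        if df.rdd.isEmpty():
            raise ValueError(f"Refusing to load an empty DataFrame into {target_path}")
        for attempt in range(1, attempts + 1):
            try:
                df.write.mode("overwrite").parquet(target_path)
                logger.info("Loaded %d rows into %s", df.count(), target_path)
                return
            except Exception:
                # Broad catch kept simple for the sketch; log and retry transient failures.
                logger.exception("Load attempt %d of %d failed", attempt, attempts)
                if attempt == attempts:
                    raise
                time.sleep(30 * attempt)  # simple linear backoff between attempts
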
Base pay is determined by a variety of factors including a candidate’s qualifications, experience, and expected contributions, as well as internal peer equity, market, and business considerations.