Senior Databricks Engineer

Thermo Fisher Scientific, Raleigh, NC
Remote

About The Position

At Thermo Fisher Scientific, you'll discover impactful work, innovative thinking, and a culture dedicated to working the right way, for the right reasons, with the customer always top of mind. The work we do matters, like helping customers find cures for cancer, protecting the environment, and supporting our customers' medical-related inquiries. As the world leader in serving science, with the largest investment in R&D in the industry, our colleagues are empowered to realize their full potential as part of a fast-growing, global organization that values passion and unique contributions. Our commitment to our colleagues across the globe is to provide the resources and opportunities they need to make a difference in our world while building a fulfilling career with us.

What will you do in this role?

The Senior Databricks Engineer will architect, build, and optimize data solutions that support Thermo Fisher Scientific's digital transformation strategy. As a vital contributor to the CRG Digital Platform and Architecture team, this position is responsible for building connections and workflows within cloud-based systems. The successful candidate will have extensive knowledge of Databricks, Apache Spark, Snowflake, Apache Iceberg, and SQL, along with a detailed understanding of cloud-native data architectures. The role requires both hands-on technical ability and strategic insight to enable scalable, secure, and impactful data environments across the enterprise.

Requirements

  • Minimum of 8 years of professional experience in data engineering or data platform development is required.
  • Minimum of 5 years of hands-on experience with Databricks and Apache Spark in production environments is required.
  • Demonstrated expertise with Snowflake and Apache Iceberg is required.
  • Strong proficiency in SQL and experience optimizing queries on large, distributed datasets is required.
  • Proven experience with cloud-based data platforms (Azure preferred; AWS or GCP acceptable) is required.
  • Strong understanding of data modeling, ETL/ELT pipelines, and data governance practices is required.
  • Experience implementing Unity Catalog or CI/CD pipelines for data workflows is preferred.
  • Strong interpersonal and communication skills, with the ability to collaborate across global, multi-disciplinary teams, are preferred.
  • Bachelor's degree or equivalent. In some cases, an equivalency consisting of a combination of appropriate education, training, and/or directly related experience will be considered sufficient for an individual to meet the requirements of the role.
  • Must be legally authorized to work in the United States without sponsorship.
  • Must be able to pass a comprehensive background check, which includes a drug screening.

Nice To Haves

  • Experience in life sciences, biotech, or manufacturing environment is preferred.

Responsibilities

  • Data Engineering and Architecture: Build, develop, and deploy scalable data pipelines and ETL/ELT processes using Databricks.
  • Engineer robust data solutions to integrate enterprise data sources, including ERP, CRM, laboratory, and manufacturing systems.
  • Develop reusable frameworks and templates to accelerate data delivery and ensure consistency across domains.
  • Cross-Platform Data Integration: Implement and maintain high-performance data connections across Databricks, Snowflake, and Iceberg environments.
  • Author and optimize complex SQL queries, transformations, and data models for analytics and reporting use cases.
  • Support data lakehouse and data mesh initiatives to enable seamless access to trusted data across the organization.
  • Governance, Quality, and Security: Apply data governance, lineage, and security controls using Unity Catalog, Delta Live Tables, and related technologies.
  • Partner with compliance and cybersecurity teams to uphold data privacy, GxP, and regulatory standards.
  • Establish monitoring, auditing, and optimization processes for ongoing data quality assurance.
  • Collaborate with data scientists, architects, and business partners to build and implement end-to-end data solutions.
  • Serve as a technical mentor and forward-looking leader within the CRG data engineering community.
  • Contribute to critical initiatives for digital platform modernization and advanced analytics enablement.