Data Engineer

HarbourVest Partners · Boston, MA
$100,000 - $160,000 · Hybrid

About The Position

For over forty years, HarbourVest has been home to a committed team of professionals with an entrepreneurial spirit and a desire to deliver impactful solutions to our clients and investing partners. As our global firm grows, we continue to add individuals who seek a collaborative, open-door culture that values diversity and innovative thinking. In our collegial environment, marked by low turnover and high energy, you'll be inspired to grow and thrive. Here, you will be encouraged to build on your strengths and acquire new skills and experiences. We are committed to fostering an environment of inclusion that promotes mutual respect among all employees; understanding and valuing these differences optimizes the potential of both the individual and the firm. HarbourVest is an equal opportunity employer.

This position is a hybrid work arrangement. You will receive 18 remote workdays per quarter to use at your discretion, subject to manager approval. For example, you may choose to work in the office four days per week and take one remote day weekly (typically 13 weeks per quarter), leaving 5 additional remote days to be used as needed.

The data engineer will work toward the transformation of our firm's data infrastructure, primarily using our MDM platform, our Snowflake data environment, and the Azure data stack. You will partner with product owners, data owners, project managers, business users, peer data engineers, and infrastructure engineers to deliver complete end-to-end solutions, and you will enjoy working in an evolving, fast-paced environment, bringing a work style marked by high energy, flexibility, quick learning, and collaboration. The role spans end-to-end solution delivery: data modeling, MDM hub operations, data quality integration, workflow and stewardship, performance tuning, and compliant operations.

Requirements

  • SQL and Python knowledge
  • Proficiency in Snowflake and Dagster
  • Highly collaborative attitude and work style, welcoming the input of others
  • Confidence and skill to document work items and explain deliverables, to enhance overall team efficiency
  • Capability to balance demands of scope, schedule, and budget
  • Successful track record contributing to data-platform projects of significant scale and complexity, involving modern MDM data platform implementations.
  • 1+ year of Snowflake-specific experience required
  • Experience organizing data schemas aligned with the needs of business applications
  • Confidence and skill in building and communicating data-flow schematics understandable to both business and technical teams
  • Perseverance, empathy, "give-and-take" attitude, and respect for the feedback and contributions of others
  • Effective time-management skills
  • Experience designing, building, and maintaining systems that create a single, accurate source of truth for critical business data, with a focus on data quality, governance, integration, and lifecycle

Nice To Haves

  • Proficiency in the Azure data environment

Responsibilities

  • Architect and configure multidomain MDM solutions.
  • Develop data models, workflows, and integration patterns aligned with cloud architecture.
  • Develop reusable workflows for data ingestion, quality, transformation, and optimization.
  • Implement and support Snowflake-based data pipelines with source data originating across a variety of technical environments
  • Analyze business and technical requirements as a basis for contributing to sophisticated conceptual, logical, and physical data models
  • Create and implement CI/CD pipelines
  • Organize work and adhere to thorough work tracking, using Agile techniques