Senior Data Engineer

PA Consulting | Boston, MA
$155,000 - $180,000 | Hybrid

About The Position

We’re an innovation and transformation consultancy that believes in the power of ingenuity to build a positive human future in a technology-driven world. Our diverse teams of experts combine innovative thinking with breakthrough technologies to progress further, faster, achieving enduring results for clients across various sectors globally.

This is a unique, multi-year, project-based opportunity to build and grow a clinical data registry platform alongside a dedicated team of collaborators and customers. As a Data Engineer for our cutting-edge medical data registry, you'll be at the forefront of managing, optimizing, and expanding our data infrastructure, enabling critical insights that can positively impact patient outcomes.

This role offers a dynamic mix of innovative new development (enhancements, optimizations, and proofs of concept) alongside meaningful hands-on ownership of production pipelines, data flows, and platform operations. You will contribute to system improvements while also owning the monitoring, troubleshooting, tuning, and day-to-day reliability of the existing data ecosystem.

Requirements

  • At least 8 years of industry experience as a data engineer
  • Advanced SQL and Python
  • Expertise in designing and building Big Data Lakes and Data Warehouses capable of ingesting, standardizing, and serving billions of rows across tens to hundreds of diverse datasets
  • Experience building dynamic, metadata-driven pipelines and analyses
  • Experience maintaining and supporting production data systems, including pipeline monitoring, troubleshooting, and delivering long‑term reliability and optimization enhancements
  • Building and managing fully automated data pipelines (ETL, ELT, ELTL)
  • Designing and building data interfaces to source systems
  • Combining and transforming data into the appropriate format for storage
  • Developing data sets for analytics purposes
  • Developing pipelines that can handle common issues/errors in a robust and automated way
  • Implementing data quality checks, observability, and cost/performance optimization
  • Cloud experience in Azure, AWS or GCP
  • Ability to balance enhancement work with operational responsibilities in a production environment
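
To make the "dynamic, metadata-driven pipelines" and "data quality checks" requirements above concrete, here is a minimal, illustrative sketch of the pattern: a per-table config object drives extraction, transformation, and a simple quality gate. All names (`TableConfig`, `run_pipeline`, `min_rows`) are hypothetical and not part of this role's actual codebase.

```python
# Minimal metadata-driven pipeline sketch: table-level metadata drives
# the transform steps and a row-count data-quality check before load.
# Every name here is illustrative, not taken from the employer's stack.
from dataclasses import dataclass, field
from typing import Callable


@dataclass
class TableConfig:
    name: str
    source_rows: list            # stand-in for a real source-system reader
    transforms: list = field(default_factory=list)
    min_rows: int = 1            # simple data-quality threshold


def run_pipeline(config: TableConfig) -> list:
    rows = list(config.source_rows)          # "extract"
    for fn in config.transforms:             # transforms chosen by metadata
        rows = [fn(r) for r in rows]
    if len(rows) < config.min_rows:          # quality gate before "load"
        raise ValueError(f"{config.name}: fewer than {config.min_rows} rows")
    return rows


cfg = TableConfig(
    name="patients",
    source_rows=[{"id": 1, "age": "42"}, {"id": 2, "age": "35"}],
    transforms=[lambda r: {**r, "age": int(r["age"])}],  # standardize types
)
print(run_pipeline(cfg))
```

Adding a new dataset in this pattern means adding a new `TableConfig`, not new pipeline code, which is what makes such pipelines scale to the tens-to-hundreds of datasets the role describes.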

Nice To Haves

  • Spark / PySpark experience highly preferable
  • Working in Agile and DevOps environments
  • Basic Python, Bash, or PowerShell for automation
  • Data modeling – Kimball, Data Vault, Star/Snowflake schema, query-first, etc.
  • Data visualization in Power BI, Tableau, Qlik, or similar
  • Architecting Data Platforms - designing BI/MI/Analytics solutions using Big Data, Relational or Streaming technologies
  • Experience with monitoring/observability tools (logging, alerting, pipeline health dashboards)
  • One or more of the following certifications: Microsoft Certified: Azure Data Engineer Associate, AWS Certified Data Analytics - Specialty, GCP Professional Data Engineer

Responsibilities

  • Managing, optimizing, and expanding our data infrastructure
  • Enabling critical insights that can positively impact patient outcomes
  • Innovative new development—enhancements, optimizations, and Proofs of Concept
  • Hands‑on ownership of production pipelines, data flows, and platform operations
  • Contributing to system improvements
  • Owning the monitoring, troubleshooting, tuning, and day-to-day reliability of the existing data ecosystem
  • Designing and building data interfaces to source systems
  • Combining and transforming data into the appropriate format for storage
  • Developing data sets for analytics purposes
  • Developing pipelines that can handle common issues/errors in a robust and automated way
  • Implementing data quality checks, observability, and cost/performance optimization
  • Maintaining and supporting production data systems, including pipeline monitoring, troubleshooting, and delivering long‑term reliability and optimization enhancements
  • Building and managing fully automated data pipelines (ETL, ELT, ELTL)
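
The "handle common issues/errors in a robust and automated way" responsibility above typically means patterns like retry-with-backoff around flaky source-system calls, so a transient failure doesn't kill an automated run. A minimal sketch, assuming a hypothetical `flaky_extract` step (every name here is illustrative):

```python
# Hedged sketch of automated error handling in a pipeline step: retry a
# transient extract failure with exponential backoff instead of failing
# the whole run. All names are illustrative, not from the posting.
import time


def with_retries(fn, attempts=3, base_delay=0.01):
    """Run fn, retrying on exception with exponential backoff."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise                      # out of retries: surface the error
            time.sleep(base_delay * (2 ** i))


calls = {"n": 0}


def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:                     # fail twice, then succeed
        raise ConnectionError("transient source-system error")
    return [{"id": 1}]


print(with_retries(flaky_extract))  # → [{'id': 1}]
```

In production this pattern is usually paired with the monitoring and alerting responsibilities listed above, so retries that are exhausted become visible on a pipeline-health dashboard rather than failing silently.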

Benefits

  • Group medical insurance
  • Health Savings Account with company match
  • Teladoc and Informed Nurse Line resources
  • Long term care plan
  • Group dental insurance
  • Vision plan
  • 401(k) Savings Plan with company profit sharing contribution
  • Commuter and Parking tax-savings benefit
  • 15 paid vacation days with the opportunity to buy five additional days
  • 10 paid Holidays plus 10 paid sick days
  • Company and Voluntary income protection benefits
  • Gym and health incentive reimbursement
  • Pet and legal insurance Plans
  • Employee Assistance Plan
  • Annual performance-based bonus