Data & Analytics Engineers

Quest Global · Greenville, SC
Posted 4d ago · $60,000 - $80,000 · Hybrid

About The Position

Quest Global delivers world-class end-to-end engineering solutions by leveraging our deep industry knowledge and digital expertise. By bringing together technologies and industries, alongside the contributions of diverse individuals and their areas of expertise, we are able to solve problems better and faster. This multi-dimensional approach enables us to solve the most critical and large-scale challenges across the aerospace & defense, automotive, energy, hi-tech, healthcare, medical devices, rail, and semiconductor industries.

We are looking for humble geniuses who believe that engineering has the potential to make the impossible possible; innovators who are not only inspired by technology and innovation, but also perpetually driven to design, develop, and test as a trusted partner for Fortune 500 customers. As a team of remarkably diverse engineers, we recognize that what we are really engineering is a brighter future for us all. If you want to contribute to meaningful work and be part of an organization that truly believes that when you win, we all win, and when you fail, we all learn, then we're eager to hear from you. The achievers and courageous challenge-crushers we seek have the following characteristics and skills.

Purpose

Build the data backbone that powers engineering insight, innovation, and execution: designing, maintaining, and evolving pipelines, semantic models, and analytics layers that deliver trusted, real-time, decision-grade intelligence across the organization.

Requirements

  • Bachelor's degree in Computer Science, Data Engineering, Information Systems, or related field; 2+ years in data engineering, BI, or analytics engineering.
  • Ability to design and implement robust data governance (Purview lineage/classification, RLS/OLS, sensitivity labels) and workspace standards; experience with distributed teams.
  • Strong proficiency in SQL, DAX, Power Query; dimensional/star schema modeling; hands-on with Microsoft Fabric (Workspaces, Dataflows Gen2, Lakehouses, Notebooks) and Power BI deployment pipelines with Git.
  • Authorized to work in the USA or posted locations.

Nice To Haves

  • Python, DevOps for BI, telemetry-based dashboarding; familiarity with vectorization/embedding pipelines to support RAG-style analytics and copilots in partnership with AI teams.

Responsibilities

  • Engineer medallion/lakehouse data pipelines on Microsoft Fabric/OneLake with accuracy, security, and consistency; manage Git-backed artifacts and deployment pipelines across dev/test/prod.
  • Build and maintain semantic models that unify engineering, HR, financial, and operational datasets; deliver data products with clear SLAs/SLOs (freshness, completeness, accuracy) and semantic conventions.
  • Implement enterprise lineage and classification with Microsoft Purview; enforce RLS/OLS, sensitivity labels, and auditable approvals for certified content.
  • Integrate core systems: HRIS (e.g., Workday), finance (e.g., SAP, PLM), engineering (Azure DevOps/Jira, CLM, PLM), and tool telemetry (App Insights/GitHub/Copilot) as governed domains.
  • Optimize performance end-to-end: query folding, composite models, incremental refresh, DAX tuning, model compression, OneLake layout, and capacity/cost management; introduce caching/cubes for high-read workloads.
  • Deliver near real-time dashboards where required via streaming/short-latency ingestion and appropriate refresh strategies.
  • Operationalize DataOps: automated data/dataset tests, DAX/Power Query unit tests, schema change detection, and deployment gates in Fabric pipelines/Power BI deployment pipelines.
  • Champion data literacy via playbooks, certified content catalogs, and office hours; measure consumer NPS and adoption.

Benefits

  • 401(k)
  • 401(k) matching
  • Dental insurance
  • Health insurance
  • Life insurance
  • Paid time off
  • Referral program