Data Engineering and Data Analyst (experience with SAP data)

Derex Technologies Inc.
Boston, MA

About The Position

Role: Data Engineering and Data Analyst (experience with SAP data)
Location: Boston, MA
Type: Contract
Duration: 6-12 months

Mandatory skills:

  • Google BigQuery and/or Snowflake
  • SAP data
  • At least one BI tool: Looker, Tableau, or Qlik Sense (multi-tool experience preferred)

The consultant will work across Google BigQuery and/or Snowflake to develop analytics-ready datasets and power dashboards in Looker, Tableau, and/or Qlik Sense. Detailed responsibilities are listed below.

Requirements

  • Strong SQL and hands-on experience with BigQuery and/or Snowflake.
  • Experience with at least one BI tool: Looker, Tableau, or Qlik Sense (multi-tool experience preferred).
  • Experience working with SAP data.
  • Solid data modeling fundamentals (dimensional modeling, metric grain, conformed dimensions).
  • Experience with pipeline/orchestration concepts (scheduling, retries, idempotency, incremental loading).
  • Strong stakeholder management skills; can drive clarity and deliver iteratively.
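To illustrate the pipeline concepts named above (incremental loading and idempotency), here is a minimal sketch in Python. It is illustrative only, not part of the role description: the in-memory "warehouse", table names, and row shape are all hypothetical stand-ins for a real BigQuery/Snowflake target.

```python
from datetime import datetime, timezone

# Hypothetical in-memory stand-in for a warehouse table plus a watermark store.
warehouse = {
    "orders": {},
    "watermarks": {"orders": datetime.min.replace(tzinfo=timezone.utc)},
}

def incremental_load(source_rows, table="orders"):
    """Upsert only rows newer than the stored watermark.

    Keyed upserts make re-running the same batch a no-op (idempotency);
    the watermark bounds each run to new data (incremental loading).
    """
    watermark = warehouse["watermarks"][table]
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    for row in new_rows:
        warehouse[table][row["id"]] = row  # upsert keyed on primary key
    if new_rows:
        warehouse["watermarks"][table] = max(r["updated_at"] for r in new_rows)
    return len(new_rows)
```

Running the same batch twice loads zero rows the second time, which is what makes retries after a partial failure safe.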

Responsibilities

  • Build and maintain pipelines from SaaS tools, operational databases, APIs, and flat files into BigQuery/Snowflake.
  • Design curated datasets and semantic-ready tables/views; improve query performance (partitioning/clustering, pruning, aggregation strategies).
  • Set up data observability: load monitoring, failure alerts, and incident/runbook documentation.
  • Troubleshoot pipeline failures and data discrepancies; conduct root-cause analysis and remediation.
  • Gather requirements, define metrics/KPIs, and translate business needs into trusted reporting.
  • Build and optimize dashboards in Looker/Tableau/Qlik Sense with strong usability, performance, and governance.
  • Enable self-service: certified datasets, standardized dimensions/measures, consistent metric definitions.
  • Provide ad-hoc analysis and recommendations; communicate insights clearly to stakeholders.
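As a sketch of the query-performance levers mentioned above (partitioning and clustering), the snippet below builds BigQuery-style DDL as a string from Python. The table and column names are hypothetical examples, not from this posting.

```python
# Illustrative BigQuery-style DDL for a curated, analytics-ready table.
# Partitioning by a date column lets the engine prune scanned partitions
# for time-bounded queries; clustering co-locates rows on common filter
# columns. All identifiers below are hypothetical.
table = "analytics.fct_orders"
partition_col = "order_date"
cluster_cols = ["customer_id", "region"]

ddl = f"""
CREATE TABLE {table}
PARTITION BY {partition_col}
CLUSTER BY {', '.join(cluster_cols)}
AS SELECT * FROM staging.orders
""".strip()

print(ddl)
```

Snowflake exposes the same idea through automatic micro-partitioning plus optional `CLUSTER BY` keys, so the modeling decision (which columns queries filter on) carries over between the two warehouses.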