Data Platform Analyst

Groendyke Transport, Enid, OK

About The Position

The Data Platform Analyst supports operational and financial decision-making by creating dependable datasets, metrics, and pipelines. This role works closely with teams across operations, safety, maintenance, and finance to turn business questions into well-defined requirements and production-ready data solutions using SQL and Python.

Requirements

  • Advanced SQL skills, including complex joins, window functions, CTEs, query optimization, and the ability to read and debug existing SQL code (including stored procedures, functions, and triggers).
  • Python proficiency, including building data pipelines and automation using common libraries (e.g., pandas) and writing maintainable code; a brief sketch combining both skills follows this list.
  • Ability to translate business needs into technical solutions and see work through to delivery.
  • Experience working with structured data models and understanding concepts such as grain, dimensions, and consistent metric definitions.
  • Strong communication skills with both technical and non-technical stakeholders.
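
As a concrete illustration of the first two requirements, the sketch below runs a CTE feeding a window function from Python with pandas. The loads table, its columns, and the sqlite3 connection are hypothetical stand-ins for the production stack, not a description of the actual schema:

    # Minimal sketch: a CTE plus a window function, executed from Python
    # with pandas. The "loads" table and its columns are hypothetical.
    import sqlite3
    import pandas as pd

    conn = sqlite3.connect("example.db")  # stand-in for the production database

    query = """
    WITH daily AS (
        SELECT driver_id,
               DATE(delivered_at) AS delivery_date,
               COUNT(*)           AS deliveries
        FROM loads
        GROUP BY driver_id, DATE(delivered_at)
    )
    SELECT driver_id,
           delivery_date,
           deliveries,
           -- trailing 7-row window per driver, ordered by date
           AVG(deliveries) OVER (
               PARTITION BY driver_id
               ORDER BY delivery_date
               ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
           ) AS rolling_7_day_avg
    FROM daily
    ORDER BY driver_id, delivery_date;
    """

    df = pd.read_sql_query(query, conn)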

Nice To Haves

  • Experience with SQL Server and Microsoft data tooling (or equivalent enterprise data stack).
  • Exposure to REST APIs and common integration patterns.
  • Experience in operational environments where data supports real-time or near-real-time decisions.

Responsibilities

  • Partner with stakeholders to understand goals, define success criteria, and translate business needs into data requirements (definitions, grain, edge cases, and acceptance criteria).
  • Ask clarifying questions early, present options with tradeoffs, and align on the simplest reliable solution.
  • Identify opportunities to improve processes, data capture, and metric definitions to reduce downstream confusion.
  • Write and maintain production-grade SQL (queries, views, stored procedures, and functions) to support dashboards, KPI reporting, and operational workflows.
  • Use Python for data pipelines, automation, validation, and integration tasks (e.g., scheduled loads, transformations, monitoring, and backfills); see the first sketch after this list.
  • Optimize and troubleshoot performance issues in SQL workloads and data pipelines.
  • Debug data issues end-to-end by reconciling across systems, identifying root causes, and implementing preventative fixes.
  • Implement data quality checks (completeness, uniqueness, referential integrity, and threshold checks) and automated alerting where appropriate, as in the second sketch after this list.
  • Document datasets and metrics so definitions are consistent and reusable (business rules, lineage, refresh cadence, known limitations).
  • Improve maintainability through clean design, modular code, version control practices, and clear operational runbooks.
  • Work with application owners and vendors to understand source system behavior and data availability.
  • Contribute to API-based data ingestion when needed (authentication patterns, pagination, rate limits, and payload validation); the final sketch after this list shows the pattern.
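
To ground the pipeline bullet above, here is a minimal sketch of an idempotent daily load that doubles as a backfill path: because the run date is a parameter, re-running a past date replaces that day's rows instead of duplicating them. Every name (staging_fuel, fact_fuel, the columns) is hypothetical, and sqlite3 stands in for whatever database the role actually uses:

    # Illustrative daily load; safe to re-run for any date (idempotent),
    # so scheduled runs and backfills share the same code path.
    from datetime import date
    import sqlite3
    import pandas as pd

    def load_daily_fuel(run_date: date, conn: sqlite3.Connection) -> int:
        # Extract one day's rows from a staging table (hypothetical schema).
        df = pd.read_sql_query(
            "SELECT * FROM staging_fuel WHERE txn_date = ?",
            conn,
            params=(run_date.isoformat(),),
        )
        # Basic transformation: normalize units before loading.
        df["gallons"] = df["gallons"].astype(float)
        # Delete-then-insert keeps the load safe to re-run.
        conn.execute(
            "DELETE FROM fact_fuel WHERE txn_date = ?",
            (run_date.isoformat(),),
        )
        df.to_sql("fact_fuel", conn, if_exists="append", index=False)
        conn.commit()
        return len(df)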
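
The data quality bullet might look like the following in practice; the table, columns, and thresholds are invented for illustration, and a non-empty result would feed whatever alerting mechanism is in place:

    # Illustrative data quality checks: completeness, uniqueness, threshold.
    import sqlite3

    def run_checks(conn: sqlite3.Connection) -> list[str]:
        failures = []
        # Completeness: key fields should never be NULL.
        (nulls,) = conn.execute(
            "SELECT COUNT(*) FROM fact_fuel WHERE driver_id IS NULL"
        ).fetchone()
        if nulls:
            failures.append(f"completeness: {nulls} rows missing driver_id")
        # Uniqueness: at most one row per txn_id.
        (dupes,) = conn.execute(
            """SELECT COUNT(*) FROM (
                   SELECT txn_id FROM fact_fuel
                   GROUP BY txn_id HAVING COUNT(*) > 1
               )"""
        ).fetchone()
        if dupes:
            failures.append(f"uniqueness: {dupes} duplicated txn_id values")
        # Threshold: values outside a plausible range signal bad capture.
        (outliers,) = conn.execute(
            "SELECT COUNT(*) FROM fact_fuel WHERE gallons <= 0 OR gallons > 500"
        ).fetchone()
        if outliers:
            failures.append(f"threshold: {outliers} rows with implausible gallons")
        return failures  # non-empty list -> raise an alert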
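
Finally, a hedged sketch of the API ingestion bullet: a paginated fetch with a simple Retry-After backoff and basic payload validation. The endpoint shape, field names, and bearer-token auth are assumptions for the example, not a specific vendor's API:

    # Illustrative REST ingestion loop: pagination, rate-limit retry,
    # and payload validation. All endpoint details are hypothetical.
    import time
    import requests

    def fetch_all(base_url: str, token: str) -> list[dict]:
        rows, page = [], 1
        headers = {"Authorization": f"Bearer {token}"}
        while True:
            resp = requests.get(
                base_url,
                headers=headers,
                params={"page": page, "per_page": 100},
                timeout=30,
            )
            if resp.status_code == 429:  # rate limited: back off and retry
                time.sleep(int(resp.headers.get("Retry-After", "5")))
                continue
            resp.raise_for_status()
            payload = resp.json()
            batch = payload.get("data", [])
            # Payload validation: reject records missing required keys.
            for rec in batch:
                if not {"id", "timestamp"} <= rec.keys():
                    raise ValueError(f"malformed record: {rec!r}")
            rows.extend(batch)
            if not payload.get("next_page"):  # stop when pages are exhausted
                break
            page += 1
        return rows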

Benefits

  • We offer a comprehensive benefits package, including health coverage, a 401(k) plan with employer match, and paid time off.