Data Engineer

Tulsa For You and Me
Tulsa, OK
Hybrid

About The Position

Tulsa For You and Me is a portfolio of talent and economic development programs focused on strengthening Tulsa's economic vitality, expanding opportunity, and fostering an inclusive city. Operated by the George Kaiser Family Foundation (GKFF), it includes organizations like Tulsa Remote, Tulsa Innovation Labs, and Build In Tulsa. The Integrated Strategies team, which includes Data & Systems, CRM Platform Solutions, and Analytics, supports these programs by building data environments, technical solutions, and analytical insights.

In this Data Engineer role, you will be a key technical contributor to the Data & Systems environment, responsible for building and maintaining the backend infrastructure that supports reporting, analytics, integrations, and data operations. This role involves the full lifecycle of data movement, from ingestion and transformation to warehouse optimization, monitoring, and troubleshooting, ensuring reliable and well-structured data for decision-making. The ideal candidate is proactive, resourceful, enjoys tackling technical challenges, and collaborates effectively in a dynamic setting.

Requirements

  • 4+ years of professional experience in data engineering, cloud data infrastructure, backend integrations, or related technical roles.
  • 3+ years of hands-on experience developing and maintaining production-grade data pipelines, transformations, and backend workflows using Python.
  • Demonstrated experience building or supporting API-based integrations and automated cloud data workflows across multiple systems.
  • Experience working within version-controlled development environments and applying collaborative software engineering practices.
  • Strong capability designing and building reliable, scalable data pipelines and backend data workflows in cloud-based environments.
  • Strong command of SQL, relational databases, and data modeling concepts.
  • Proven ability to integrate and manage data across multiple systems, including APIs, databases, cloud services, and third-party platforms.
  • Strong understanding of modern software engineering and deployment practices (version control, testing, code review, release workflows).
  • Strong working knowledge of cloud environments and core data system services (storage, compute, orchestration, monitoring, security).
  • Strong troubleshooting and problem-solving ability, with a focus on diagnosing failures, improving system reliability, and maintaining data integrity.
  • Ability to operate effectively in evolving technical environments, bringing structure, sound judgment, and continuous improvement.
  • Ability to work independently and collaboratively with technical teammates, vendors, analysts, and internal stakeholders.
  • Strong written and verbal communication skills, including the ability to document systems clearly and explain technical concepts to diverse audiences.

Nice To Haves

  • Experience designing and supporting cloud-based data solutions within GCP and Azure.
  • Exposure to Infrastructure as Code (IaC), deployment automation, or related practices.
  • Experience working in Agile, sprint-based technical environments.
  • Experience using workflow management tools such as Jira and/or JSM.

Responsibilities

  • Build, maintain, and continuously improve scalable data pipelines, integrations, and warehouse workloads to support reliable and efficient data operations.
  • Design and implement backend integrations for new and existing data sources (APIs, databases, file-based feeds, cloud platforms, third-party systems).
  • Develop, test, deploy, and support reusable data workflows, transformation jobs, warehouse objects, scripts, and related engineering components.
  • Monitor pipeline, integration, and warehouse performance; troubleshoot failures, optimize system reliability, and resolve data processing issues.
  • Establish and maintain logging, alerting, automated monitoring, and data quality controls.
  • Support secure data handling, system access controls, and governance practices.
  • Strengthen backend data architecture, engineering standards, and technical workflows.
  • Evaluate and apply modern tools, technologies, and engineering practices.
  • Create and maintain technical documentation for pipelines, integrations, warehouse models, and backend workflows.
  • Collaborate with analysts, technical teammates, vendors, and internal stakeholders to ensure backend data systems support reporting, analytics, and decision-making.
  • Perform other duties as assigned.

Benefits

  • Work/life flexibility
  • Comprehensive health benefits
  • 100% employer-paid medical, dental, and long-term disability for full-time employees
  • Option to add vision and dependents
  • 401(k) employee and employer contributions
  • Paid time off
  • Paid holidays
  • Employer-paid AD&D life insurance
  • Employee option to add supplemental life insurance