Data Engineer

InTulsa Initiative LLC
Tulsa, OK
Hybrid

About The Position

Tulsa For You and Me is a portfolio of talent and economic development programs focused on strengthening Tulsa's economic vitality and expanding opportunity. Integrated Strategies is the centralized data, systems, and analytics team within Tulsa For You and Me, charged with enhancing program effectiveness and decision-making through robust internal infrastructure.

The Data Engineer is a key technical contributor within the Data & Systems environment, responsible for building and maintaining the backend infrastructure that supports reporting, analytics, integrations, and data operations. The position requires hands-on experience in Python-based data engineering, cloud data environments, relational databases, and backend systems integration. The Data Engineer will manage the full lifecycle of data movement, from ingestion and transformation to warehouse optimization, monitoring, and troubleshooting, ensuring reliable data for decision-making. Success hinges on initiative, resourcefulness, and the ability to tackle unfamiliar technical challenges collaboratively in a dynamic setting.

Requirements

  • 4+ years of professional experience in data engineering, cloud data infrastructure, backend integrations, or related technical roles supporting enterprise data environments.
  • 3+ years of hands-on experience developing and maintaining production-grade data pipelines, transformations, and backend workflows using Python.
  • Demonstrated experience building or supporting API-based integrations and automated cloud data workflows across multiple systems.
  • Experience working within version-controlled development environments and applying collaborative software engineering practices.
  • Strong capability designing and building reliable, scalable data pipelines and backend data workflows in cloud-based environments.
  • Strong command of SQL, relational databases, and data modeling concepts, with the ability to structure and optimize data for downstream use.
  • Proven ability to integrate and manage data across multiple systems, including APIs, databases, cloud services, and third-party platforms.
  • Strong understanding of modern software engineering and deployment practices, including version control, testing, code review, and release workflows.
  • Strong working knowledge of cloud environments and the core services that support data systems, including storage, compute, orchestration, monitoring, and security.
  • Strong troubleshooting and problem-solving ability, with a focus on diagnosing failures, improving system reliability, and maintaining data integrity.
  • Ability to operate effectively in evolving technical environments, bringing structure, sound judgment, and continuous improvement to systems and processes.
  • Ability to work independently and collaboratively with technical teammates, vendors, analysts, and internal stakeholders.
  • Strong written and verbal communication skills, including the ability to document systems clearly and explain technical concepts to technical and non-technical audiences.

Nice To Haves

  • Experience designing and supporting cloud-based data solutions within GCP and/or Azure strongly preferred.
  • Exposure to Infrastructure as Code (IaC), deployment automation, or related practices used to manage cloud-based technical environments.
  • Experience working in Agile, sprint-based technical environments and using workflow management tools such as Jira and/or JSM.

Responsibilities

  • Build, maintain, and continuously improve scalable data pipelines, integrations, and warehouse workloads that support reliable and efficient data operations across the organization.
  • Design and implement backend integrations for new and existing data sources—including APIs, databases, file-based feeds, cloud platforms, and third-party systems—to ensure accurate and dependable data movement.
  • Develop, test, deploy, and support reusable data workflows, transformation jobs, warehouse objects, scripts, and related engineering components.
  • Monitor pipeline, integration, and warehouse performance; proactively troubleshoot failures, optimize system reliability, and resolve data processing issues.
  • Establish and maintain logging, alerting, automated monitoring, and data quality controls that strengthen visibility, issue detection, and long-term operational stability.
  • Support secure data handling, system access controls, and governance practices to ensure organizational data is managed responsibly and in accordance with internal standards.
  • Strengthen backend data architecture, engineering standards, and technical workflows to improve maintainability, scalability, and delivery speed.
  • Evaluate and apply modern tools, technologies, and engineering practices that improve the resilience and effectiveness of the overall data environment.
  • Create and maintain technical documentation across pipelines, integrations, warehouse models, and backend workflows to support team continuity and long-term sustainability.
  • Partner closely with analysts, technical teammates, vendors, and internal stakeholders to ensure backend data systems reliably support reporting, analytics, and operational decision-making.
  • Other duties as assigned.

Benefits

  • Work/life flexibility
  • Comprehensive health benefits
  • Paid time off
  • Generous retirement contributions
  • 100% employer-paid medical, dental, and long-term disability coverage for full-time employees
  • Option to add vision and dependents
  • 401(k) employee and employer contributions
  • Paid holidays
  • Employer-paid AD&D life insurance, with the option to add supplemental life insurance