Database Developer

Greater Kansas City Community Foundation, Kansas City, MO
Hybrid

About The Position

The Database Developer is responsible for designing, building, and maintaining reliable data integration pipelines that power enterprise reporting, analytics, and operational systems. This role focuses on the consistent and secure movement of data across applications, warehouses, and cloud platforms. The Database Developer works across the organization to provide reliable, analytics-ready data that informs effective decision-making. This is a full-time, exempt, salaried position reporting to the Director of Data Technologies. Candidates must be local to Kansas City, MO; after a successful training period, opportunities to work remotely are available.

Requirements

  • 3-6 years of related experience with a bachelor's degree; an equivalent combination of education and experience will be considered.
  • Hands-on experience developing and supporting ETL/ELT pipelines in Azure, SQL Server, or comparable data environments.
  • Strong SQL skills with the ability to troubleshoot data quality, transformation, and performance issues.
  • Experience integrating data from APIs, SaaS platforms, files, and relational databases (including JSON and XML payloads).
  • Experience managing job orchestration, scheduling, documentation, retries, and failure recovery for data workflows.
  • Familiarity with version control (Git) and deployment practices for data pipelines and integration code.
  • Ability to monitor pipeline execution and proactively identify and remediate reliability issues.

Nice To Haves

  • Experience with Azure Data Factory, Fabric Data Factory, SSIS, or similar tools.
  • Exposure to cloud data warehouses or analytics platforms.
  • Experience working in regulated, audit-conscious, or highly governed environments.

Responsibilities

  • Design, build, and maintain scalable batch and near-real-time data pipelines.
  • Develop integrations using tools such as Azure Data Factory, Fabric Data Factory, SSIS, or equivalent technologies.
  • Maintain clear, well-documented data processes that ensure secure, reliable data delivery.
  • Develop structured ETL and ELT processes that support data warehouse models and downstream analytics.
  • Partner with staff to ensure data structures align with reporting and semantic model needs.
  • Manage orchestration, scheduling, and dependencies across data workflows.
  • Implement automation to improve reliability, monitoring, and recovery from failures.
  • Partner with leadership to evaluate, govern, and leverage AI-enabled capabilities within the data platform ecosystem.
  • Leverage AI-assisted tooling to improve platform reliability and operations efficiencies.
  • Monitor pipeline execution and proactively identify failure patterns.
  • Implement improvements to increase resiliency, observability, and operational predictability.
  • Work with application owners (e.g., financial systems, CRM platforms) to understand data structures, APIs, and integration requirements.
  • Maintain clear documentation for data flows, lineage, integration logic, and operational processes.
  • Support ingestion of files, APIs, JSON/XML payloads, and database sources.