Sr. Data Engineer

Oakworth Capital Bank, Homewood, AL

About The Position

Oakworth Capital Bank is expanding and has an excellent opportunity for someone to join our team in Birmingham, AL! We are looking for a full-time Sr. Data Engineer who will play a vital role in supporting our current and future clients. An ideal candidate will meet the qualifications listed below and, more importantly, demonstrate that they live by the Oakworth Core Values (Golden Rule, Character, Innovative Spirit, Professionalism, Work Ethic).

Summary: The Senior Data Engineer is responsible for building and maintaining a reliable, scalable data platform that supports analytics, reporting, and operational needs across the business. This is a hands-on, execution-focused role grounded in practical data engineering: owning the data pipelines, transformations, and infrastructure that support growth and evolve with the organization's needs.

Requirements

  • Bachelor’s degree in computer science, information systems, or a related technical field
  • 5+ years in data engineering, including experience in a cloud environment
  • Deep SQL expertise across platforms (e.g., PostgreSQL, SQL Server, cloud-native warehouse platforms)
  • Strong Python skills for data processing and integration (pandas, polars, etc.)
  • Familiarity with orchestration and deployment tooling (e.g., Dagster, Airflow, Azure Data Factory, GitHub Actions)
  • Proven ability to support and scale data infrastructure in a business-facing environment

Nice To Haves

  • Experience with dbt or a similar modeling framework
  • Financial services experience

Responsibilities

  • Design, build, and maintain secure, observable ELT pipelines using Python and SQL, supporting structured ingestion from internal systems and external vendors.
  • Lead the transition of legacy, on-prem data flows into a scalable, cloud-based architecture aligned with current and future business needs.
  • Own transformation logic and data modeling patterns to support BI, regulatory reporting, and operational analytics, using tools like dbt where appropriate.
  • Define and enforce standards for naming, documentation, and testing, and implement monitoring across pipeline health, job failures, and data quality.
  • Collaborate with stakeholders to translate business needs into structured data assets, contribute to broader architectural decisions (including tooling and modeling strategy), and align all work with data governance and security expectations.
  • Own the design and execution of core data pipelines that are structured, testable, and observable.
  • Model datasets to support Power BI and other downstream analytics tools.
  • Manage transformations using Python and/or SQL-based modeling tools (e.g., dbt).
  • Contribute to architectural decisions such as storage, orchestration, and modeling patterns as the platform evolves.
  • Interface with application owners, analysts, and business users to translate needs into structured data models.
  • Maintain operational documentation and pipeline transparency for continuity and support.
  • Ensure pipelines and assets align with governance expectations, including access, retention, and classification.