Senior Data Engineer

Barclays
Jefferson, CO
Onsite

About The Position

Embark on a transformative journey as a Senior Data Engineer. At Barclays, our vision is clear: to redefine the future of banking and help craft innovative solutions. As a Senior Data Engineer on our Marketing and Communications Platform team, you will drive the next wave of innovation that transforms how Barclays connects with millions of customers across the U.S. Consumer Bank. This role sits at the heart of our digital communication strategy, designing resilient, customer‑centric solutions that power personalized engagement at massive scale. You will produce end‑to‑end data warehouse products, working on tasks such as gathering requirements, architecting schema objects, building application objects, and delivering a finished data fabric product. Your work will directly help Barclays accelerate modernization, enhance customer experience, and deliver high‑impact capabilities across our channels.

Requirements

  • Design, develop, and maintain end‑to‑end ETL/ELT pipelines on AWS, along with Snowflake data models, schemas, virtual warehouses, and data sharing
  • Perform data ingestion, transformation, validation, and orchestration for structured and semi‑structured data; build Python‑based data pipelines, reusable frameworks, automation, and unit tests
  • Implement orchestration using Airflow, AWS Glue, Lambda, dbt, or similar tools; ensure data quality, reliability, monitoring, logging, and alerting across pipelines; and apply data security, governance, and access controls (RBAC, masking, compliance standards)
  • Translate business requirements into technical designs, including source‑to‑target mappings and runbooks; provide production support, issue resolution, and root cause analysis; and minimize downtime
  • Support CI/CD, source control, code reviews, and adherence to development standards; contribute to logical and physical data models and system integration testing

Nice To Haves

  • Experience with enterprise-scale message streaming and eventing platforms (such as Kafka, AWS Kinesis), enabling real-time data ingestion, event-driven architectures, and low-latency data pipelines
  • Problem-solving and evaluative capability, providing technical guidance in designing, troubleshooting, and evolving multi-faceted data platforms, architectures, and system integrations
  • Stakeholder and partner guidance, effectively working across business, product, compliance, and global delivery teams—driving alignment and delivering solutions
  • Deep familiarity with the banking and financial services domain, including security, compliance, data protection, and regulatory expectations for data and cloud-based solutions
  • Progressive experience delivering enterprise-scale data warehouse and engineering solutions

Responsibilities

  • Provision of subject matter expertise to support the collaboration between the product owner and the technical side of product development.
  • Support the development and implementation of the product strategy and vision defined in the product roadmap and communicate them with the relevant stakeholders and the development team.
  • Collaboration with internal stakeholders to gather product requirements and features that are well defined, measurable and secure, prioritised by business value and feasibility.
  • Development and implementation of assessments to ensure continuous testing and improvement of product quality and performance.
  • Monitoring of product performance to identify opportunities for optimisation that meet the bank's performance standards.
  • Stay abreast of the latest industry trends and technologies, to evaluate and adopt new approaches that improve product development and delivery.

Benefits

  • Medical, dental and vision coverage
  • 401(k)
  • Life insurance
  • Other paid leave for qualifying circumstances
© 2026 Teal Labs, Inc