About The Position

At U.S. Bank, we're on a journey to do our best: helping the customers and businesses we serve make better and smarter financial decisions, and enabling the communities we support to grow and succeed. We believe it takes all of us to bring our shared ambition to life, and each person is unique in their potential. A career with U.S. Bank gives you a wide, ever-growing range of opportunities to discover what makes you thrive at every stage of your career. Try new things, learn new skills, and discover what you excel at, all from Day One.

Job Description

We are seeking a highly skilled Senior Cloud Data Engineer to join our Data Product Engineering team. This hands-on, delivery-focused role is responsible for designing, building, and operating enterprise-grade data products across a multi-cloud ecosystem that includes Azure, AWS, Databricks, and Snowflake. The ideal candidate brings deep experience in modern cloud data platforms, distributed data processing, and secure data integration. You will collaborate closely with product managers, analytics teams, data governance, and security partners to ensure data products are scalable, reliable, compliant, and reusable across the enterprise. This role plays a key part in evolving our data platforms from legacy and first-generation architectures to cloud-native, domain-oriented data products while adhering to financial services governance and control requirements.

Requirements

  • Bachelor’s degree, or equivalent work experience
  • Typically three to five years of relevant experience required
  • 8+ years of experience in data engineering preferred, with significant experience on cloud platforms.
  • Proven hands-on experience building and operating data solutions in Azure and/or AWS.
  • Strong experience delivering production-grade data pipelines and data products.
  • Solid understanding of data governance, data quality, and security concepts in regulated environments.
  • Excellent communication skills and ability to collaborate across engineering, product, and governance teams.
  • Experience with data architecture and platform design in large enterprises.
  • Strong hands-on experience with Azure Data Platform services, including Azure Data Factory, Azure Data Lake Storage, and Azure Synapse Analytics (or equivalent Microsoft Fabric experience).
  • Experience with AWS data services, such as AWS Glue, S3, and event-driven integrations.
  • Deep experience with Databricks (Spark, Delta Lake, performance tuning).
  • Strong working knowledge of Snowflake, including data modeling, ingestion patterns (e.g., Snowpipe), and data sharing.
  • Expertise in Apache Spark for large-scale data processing.
  • Experience building batch and near-real-time data pipelines.
  • Strong SQL skills and experience with dimensional and analytical data modeling.
  • Experience designing reusable, domain-oriented data products.
  • Experience with API-based integrations (REST; familiarity with SOAP and GraphQL is a plus).
  • Hands-on experience integrating with API gateways.
  • Understanding of messaging and streaming platforms such as Kafka, MQ, AWS SQS, or RabbitMQ.
  • Strong understanding of IAM, RBAC, OAuth 2.0, TLS/mTLS, and JWT.
  • Experience implementing secure data access patterns in cloud environments.
  • Familiarity with data cataloging, lineage, and metadata management concepts.
  • Experience enabling self-service analytics and BI using tools such as Power BI, Tableau, or equivalent.

Nice To Haves

  • Experience supporting AI initiatives through the data platform and data products.
  • Prior experience in financial services or other highly regulated industries.
  • Professional certifications in Microsoft Azure and/or AWS.
  • Strong problem-solving skills and a track record of delivering scalable, efficient data solutions.
  • Master’s degree in a relevant technical field.

Responsibilities

  • Design, build, and maintain cloud-native data pipelines and data products across Azure and AWS using Databricks and Snowflake.
  • Lead and contribute to the modernization and migration of on‑prem and legacy data platforms to cloud-based solutions.
  • Implement batch and streaming data processing patterns using Spark and cloud-native services.
  • Partner with data governance, security, and risk teams to ensure data products comply with enterprise governance, data privacy, and regulatory requirements.
  • Enable secure data sharing and access patterns across domains and platforms using appropriate controls.
  • Define and promote data engineering best practices, including CI/CD, testing, observability, performance tuning, and cost optimization.
  • Collaborate with product owners and analytics teams to translate business requirements into well-modeled, high-quality datasets.
  • Work closely with cloud and security architects to implement secure, scalable, and resilient data solutions.
  • Support and mentor junior engineers through design reviews, code reviews, and technical guidance.

Benefits

  • Healthcare (medical, dental, vision)
  • Basic term and optional term life insurance
  • Short-term and long-term disability
  • Pregnancy disability and parental leave
  • 401(k) and employer-funded retirement plan
  • Paid vacation (from two to five weeks depending on salary grade and tenure)
  • Up to 11 paid holiday opportunities
  • Adoption assistance
  • Sick and Safe Leave accruals of one hour for every 30 hours worked, up to 80 hours per calendar year unless otherwise provided by law