Data Engineer, Data Innovation and Tools Rationalization

U.S. Bank, Minneapolis, MN (Hybrid)

About The Position

We are seeking a highly skilled and forward-thinking data engineer to join the Data Innovation and Tools Rationalization team within the Enterprise Data Office. This role contributes to the modernization of enterprise data capabilities by designing, building, and supporting scalable data product solutions aligned with the Enterprise Data Strategy, with a focus on building and scaling next-generation data product engineering patterns that enable faster, more consistent, and more reliable delivery across the enterprise.

The data engineer will work with modern data platforms, cloud technologies, and analytics tools to improve data accessibility, reliability, and reuse across the organization. The ideal candidate combines stand-out hands-on engineering skills with a product mindset. You will design reusable frameworks, define engineering standards, evaluate emerging technologies, and partner closely with execution and enablement teams to operationalize modern data patterns at scale. This role plays a critical part in accelerating platform adoption, improving developer productivity, and reducing fragmentation across data and analytics solutions.

Requirements

  • Deep understanding of financial institution and banking concepts.
  • Strong understanding of modern data engineering concepts, including batch and streaming data processing, data modeling, and data product design.
  • Experience building scalable data solutions on cloud-based data platforms.
  • Familiarity with enterprise data ecosystems and shared platform models.
  • Ability to assess tradeoffs across tools, architectures, and implementation approaches.
  • Strong analytical and problem-solving skills with a focus on root cause analysis and optimization.
  • Proficiency with big data technologies (Spark, Airflow, Hadoop, Hive).
  • Hands-on experience with Snowflake and Databricks, including performance tuning.
  • Proficiency in SQL and Python, with experience building production-grade data pipelines.
  • Experience with CI/CD pipelines and infrastructure-as-code patterns for data platforms.
  • Familiarity with orchestration and workflow management tools.
  • Experience developing reusable libraries, templates, or internal frameworks.
  • Exposure to cloud platforms such as Azure, AWS, or GCP and cloud-native data services.
  • Understanding of data quality, observability, and monitoring practices.
  • Familiarity with AI and ML tooling as it relates to data engineering and platform enablement is a plus.
  • Bachelor’s Degree in a quantitative field such as computer science, data science, mathematics, or statistics.
  • 5 to 7 years of statistical and/or analytical experience.

Nice To Haves

  • Typically 6+ years of experience in data engineering, analytics engineering, or platform engineering roles.
  • Demonstrated experience building and supporting data solutions in a cloud environment.
  • Proven track record of designing reusable components or standards adopted by multiple teams.
  • Experience working in regulated or large-scale enterprise environments preferred.
  • Strong organizational skills with the ability to work on multiple initiatives concurrently.
  • Knowledge of banking regulation and requirements for regulatory reporting.
  • Strong analytical, organizational, and problem-solving skills.
  • Expertise in visual analytics tools such as Power BI, Tableau, or equivalent platforms.
  • Experience with Power Platform tools such as Power Automate and Power Apps.
  • Proven track record in automating and optimizing ETL processes at scale.
  • Excellent written and verbal communication skills for documenting technical processes and engaging with cross-functional teams.

Responsibilities

  • Designing and building next-generation data product engineering patterns on modern cloud platforms including Snowflake and Databricks.
  • Developing reusable engineering assets such as frameworks, build kits, CI/CD templates, and performance optimization approaches.
  • Partnering with Enablement and Execution teams to operationalize and scale data engineering patterns across delivery teams.
  • Evaluating, testing, and experimenting with emerging data and AI tools, platforms, and services.
  • Participating in technical proofs of concept, comparing alternative solutions, and making data-driven recommendations for platform and tool rationalization.
  • Documenting project outcomes, transition plans, adoption guides, and solution usage scripts to support enterprise rollout.
  • Supporting platform modernization efforts through hands-on development, tuning, and optimization.
  • Collaborating with data product owners, architects, and platform teams to align engineering solutions with enterprise data strategy.

Benefits

  • Healthcare (medical, dental, vision)
  • Basic term and optional term life insurance
  • Short-term and long-term disability
  • Pregnancy disability and parental leave
  • 401(k) and employer-funded retirement plan
  • Paid vacation (from two to five weeks depending on salary grade and tenure)
  • Up to 11 paid holiday opportunities
  • Adoption assistance
  • Sick and Safe Leave accruals of one hour for every 30 worked, up to 80 hours per calendar year unless otherwise provided by law