Data Management Pipeline Engineer, Vice President

MUFG Bank — Jersey City, NJ
$143,000 - $182,000

About The Position

Discover your opportunity with Mitsubishi UFJ Financial Group (MUFG), one of the world's leading financial groups. Across the globe, we're 120,000 colleagues, striving to make a difference for every client, organization, and community we serve. We stand for our values, building long-term relationships, serving society, and fostering shared and sustainable growth for a better world. With a vision to be the world's most trusted financial group, it's part of our culture to put people first, listen to new and diverse ideas and collaborate toward greater innovation, speed and agility. This means investing in talent, technologies, and tools that empower you to own your career. Join MUFG, where being inspired is expected and making a meaningful impact is rewarded.

The selected colleague will work at an MUFG office or client sites four days per week and work remotely one day. A member of our recruitment team will provide more details.

Job Summary: The Enterprise Data Capabilities team in Technology is focused on a transformative journey that will support our data strategy on how we develop and deliver critical business outcomes, in addition to how we engage with partners across MUFG for Data, AI, and BI services. Our team is seeking an experienced and dynamic individual to join as a Data Management Pipeline Engineer, Vice President to support projects and BAU work efforts.

Requirements

  • 10+ years of overall IT experience.
  • 7+ years of experience in data engineering and data science, ideally focused on enabling and accelerating Data Pipeline Integration Engineering.
  • 5+ years' experience in Data Product Development.
  • Business Acumen: Banking & Financial Services data domain knowledge.
  • Understanding the context, goals, and challenges of the business you support.
  • Connecting data to business outcomes.
  • Operational knowledge - optimizing processes, managing resources effectively, and implementing quality control measures.
  • Strong organizational and interpersonal skills.
  • Excellent communication skills with the ability to describe technological concepts in ways the business can understand.
  • Strong leadership skills with the ability to engage with developers, support teams, and other networks to solve problems.
  • Strong problem-solving and analytical skills, and the ability to thrive in a fast-paced, collaborative environment.
  • Must have experience in Data analysis, including creating complex SQL queries.
  • Must have experience designing large-scale data lake and lakehouse architectures.
  • Data Modeling and Architecture Design - Working knowledge of 3NF data models, dimensional data models, etc. in financial domains.
  • Must have experience implementing security measures to protect data, understanding compliance requirements like GDPR, and managing access controls.
  • Must have experience implementing Data Quality Controls (using Informatica Data Quality or Collibra Data Quality).
  • Experience managing real-time data pipelines with low-latency SLAs.
  • Experience with creating large-scale data engineering pipelines.
  • Experience working with large datasets, cloud (e.g. AWS/GCP), Infrastructure as Code and developing machine learning models in a cloud environment.

Nice To Haves

  • A master's degree in a relevant technology field is a plus.

Responsibilities

  • Design, build, operate and deploy data pipelines at scale using best practices.
  • Build and publish data integration patterns and best practices.
  • Create and optimize data pipelines / ETL pipelines for performance.
  • Develop new techniques and data pipelines that will enable various insights for stakeholders.
  • Ensure alignment and integration of data architecture and data models across different data products and platforms.
  • Design and implement a robust set of controls and reconciliation tools and platforms to support point-to-point and end-to-end comprehensiveness controls and G/L reconciliations.
  • Identify and resolve performance bottlenecks for production systems.
  • Conduct peer reviews for quality, consistency, and rigor for production-level solutions.
  • Recommend designs and drive system improvements to meet data modernization, data integration, and data consumption needs.
  • Define standards and frameworks for testing data movement and transformation code and data components.
  • Create and optimize CI/CD data pipelines for performance and automated tests.
  • Design, build, operate and deploy real-time data pipelines at scale using AI methods and best practices.
  • Build and manage a team of talented data engineers and collaborate with engineering and product teams for developing data engineering solutions.
  • Apply cutting-edge data warehousing, data science, and data engineering technologies to automate low-value tasks, enable faster time to market, and improve reusability for new AI initiatives.
  • Partner with AI R&D teams and the Data Platform Team in collecting, creating, curating, and maintaining high-quality AI datasets.
  • Stay updated on new tools and development strategies, and bring innovation recommendations to leadership.

Benefits

  • Comprehensive health and wellness benefits.
  • Retirement plans.
  • Educational assistance and training programs.
  • Income replacement for qualified employees with disabilities.
  • Paid maternity and parental bonding leave.
  • Paid vacation, sick days, and holidays.

What This Job Offers

Job Type

Full-time

Career Level

Senior

Industry

Credit Intermediation and Related Activities

Education Level

Bachelor's degree

© 2024 Teal Labs, Inc