Lead Data Integration Engineer (Snowflake + Python)

DTCC Candidate Experience Site | Tampa, FL
Hybrid

About The Position

Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We are committed to helping our employees grow and succeed, and we believe you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world we serve.

The Information Technology group delivers secure, reliable technology solutions that enable DTCC to be the trusted infrastructure of the global capital markets. The team delivers high-quality information through activities that include developing essential systems, building infrastructure capabilities to meet client needs, and implementing data standards and governance.

Pay and benefits are summarized in the Benefits section below. DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee).

The impact you will have in this role: As a member of the IT FinSight Delivery team, the Lead Data Integration Engineer specializes in planning, detailing technical requirements, designing, developing, and testing all software systems and applications for the firm. The role works closely with architects, product managers, project management, and end users to develop and improve existing software systems and applications, proposing and recommending solutions that tackle complex business problems.

Requirements

  • Minimum of 6 years of related experience
  • Bachelor's degree preferred and/or equivalent experience
  • Strong ETL developer with hands-on experience developing large-scale data engineering pipelines for financial services, preferably for risk management.
  • Minimum of 6 years of related experience building and maintaining large-scale data warehouse applications in the cloud.
  • Minimum of 3 years of experience as a Python developer building complex data pipelines that load data into Snowflake (a minimal, hypothetical sketch of such a load follows this list).
  • Minimum of 6 years of hands-on experience writing, tuning, and managing complex SQL, and creating stored procedures and database objects for large-scale data warehouse systems in Snowflake, Oracle, etc.
  • Must have knowledge of dimensional modelling techniques such as star schemas, type 2 slowly changing dimensions, and the various types of fact tables.
  • Hands-on experience with shell scripting for script-based ELT pipelines using Python or Snowpark.
  • Hands-on experience creating and managing Autosys JILs or Airflow for orchestration.
  • Knowledge of ETL tools such as Talend or similar, and the ability to create heterogeneous pipelines.
  • Strong hands-on experience with code versioning in Bitbucket, Git, and Liquibase, and with managing multiple release versions and CI/CD pipelines.
  • Good understanding of enterprise data integration concepts, data warehouse modelling, and data architecture patterns.
  • Experience working on Agile projects and knowledge of Jira for tracking and updating project tasks.
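
To make the Snowflake and Python requirements above concrete, here is a minimal sketch of the kind of pipeline described: it bulk-loads a DataFrame into a Snowflake staging table and runs a type 2 slowly-changing-dimension expiry step. All table, database, warehouse, and credential names are placeholders, not part of this posting, and the code is an illustration under those assumptions rather than DTCC's actual implementation.

# Minimal sketch (hypothetical names throughout): load a DataFrame into a
# Snowflake staging table with Python, then expire changed rows in a
# type 2 slowly changing dimension.
import os

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Connection details come from the environment; every value below is an
# assumed placeholder, not a real resource.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="RISK_DB",
    schema="STAGING",
)

# Stand-in for an extracted source feed; a real pipeline would read from
# files, an API, or another database.
positions = pd.DataFrame(
    {"ACCOUNT_ID": [101, 102], "EXPOSURE": [1250000.0, 430000.0]}
)

# Bulk-load the DataFrame into a staging table, creating it if needed.
success, _, nrows, _ = write_pandas(
    conn, positions, "STG_POSITIONS", auto_create_table=True
)
print(f"loaded={success} rows={nrows}")

# Type 2 SCD expiry step: close out current dimension rows whose tracked
# attribute changed. A second INSERT pass (not shown) would add the new
# versions of those rows.
conn.cursor().execute(
    """
    MERGE INTO DIM_ACCOUNT d
    USING STG_POSITIONS s
      ON d.ACCOUNT_ID = s.ACCOUNT_ID AND d.IS_CURRENT = TRUE
    WHEN MATCHED AND d.EXPOSURE <> s.EXPOSURE THEN UPDATE
      SET IS_CURRENT = FALSE, EFFECTIVE_END = CURRENT_DATE()
    """
)
conn.close()

write_pandas stages the frame and loads it with Snowflake's COPY INTO under the hood, which is why a bulk load like this is generally preferred over row-by-row inserts for large pipelines.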

Responsibilities

  • Lead engineering and development focused projects from start to finish with minimal supervision.
  • Provide technical and operational support for our customer base as well as other technical areas within the company.
  • Review and supervise the system design and architecture.
  • Interact with stakeholders to understand requirements and provide solutions.
  • Perform risk management functions such as reconciliation of vulnerabilities and security baselines, as well as other risk- and audit-related objectives.
  • Refine and prioritize the backlog for the team in partnership with product management.
  • Groom and guide the team of employees and consultants.
  • Responsible for employee engagement, growth and appraisals.
  • Participate in user training to increase awareness of the platform.
  • Ensure incident, problem and change tickets are addressed in a timely fashion, as well as escalating technical and managerial issues.
  • Ensure quality and consistency of data from source systems, and align with data product managers on resolving data issues in a consistent manner.
  • Follow DTCC's ITIL process for incident, change, and problem resolution.

Benefits

  • Competitive compensation, including base pay and annual incentive
  • Comprehensive health and life insurance and well-being benefits, based on location
  • Pension / Retirement benefits
  • Paid Time Off, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.