Cyber Data Governance & Data Quality Lead

Sumitomo Mitsui Banking Corporation, Charlotte, NC
Hybrid

About The Position

This critical role will lead the Data Governance & Data Quality team for the AD Information Security department (ISDAD) and will report to the Head of Cybersecurity Strategic Data & Business Management. The role is part of the overall cyber data initiative focused on building the security and risk data platforms for the Cybersecurity Data Lakehouse (CyberDW). The goal of the CyberDW is to centralize ISDAD data and establish effective data governance around the data sources and their data lineage. This newly formed role will build a team that collaborates with developers, data owners, governance leads, and business analysts within the Information Technology (IT) department, as well as other stakeholders aligned with the applications owned by ISDAD.

Requirements

  • Bachelor’s degree in Computer Science, Information Security, Data Management, or a related field
  • 10+ years’ experience in IT development, data governance, data analysis, or related roles, preferably in a highly regulated environment such as financial services
  • Hands-on experience with data management and governance tools (e.g., Collibra DIP/DQIM/DQE). Strong prior experience with data profiling, key data element (KDE) definition, and DQ rule creation, along with the ability to identify patterns and detect statistical anomalies in datasets, is highly preferred.
  • Hands-on experience writing SQL queries and an understanding of ETL processes. Proficient in Python, PySpark, Java, or similar high-level server-side languages for scripting QA and alert processing.
  • Familiarity with DevOps and CI/CD pipelines (Jenkins, GitLab, Azure DevOps). Experience with Azure cloud services, Azure Data Factory, Gen 2, Azure Databricks is a plus.
  • Experience with JIRA, Xray, REST API web services and microservice architecture. Knowledge of IBM Tivoli Workload Scheduler a plus.
  • Knowledge of enterprise Information Security data (i.e., Phishing, Identity Management, Privileged Access, Cloud Security, Incident Response, Vulnerability Management, Threat Detection)
  • Strong problem-solving and analytical skills, with a proactive, results-oriented approach.
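To make the data profiling and anomaly detection requirement above concrete, here is a minimal sketch, not part of the posting, of the kind of profiling and z-score outlier check a candidate might script in Python. The column data and thresholds are hypothetical, and a real implementation would run against CyberDW sources via a tool such as Collibra DQ rather than in-memory lists.

```python
from statistics import mean, stdev

def profile_column(values):
    """Basic profile of a numeric column: row count, nulls, distinct, min/max."""
    non_null = [v for v in values if v is not None]
    return {
        "count": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "min": min(non_null),
        "max": max(non_null),
    }

def zscore_outliers(values, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    non_null = [v for v in values if v is not None]
    mu, sigma = mean(non_null), stdev(non_null)
    if sigma == 0:
        return []
    return [v for v in non_null if abs(v - mu) / sigma > threshold]

# Hypothetical daily event counts with one obvious spike and one null.
counts = [100, 98, 102, 101, 99, 100, 5000, 97, None, 103]
print(profile_column(counts))
print(zscore_outliers(counts, threshold=2.0))  # flags the 5000 spike
```

The same pattern-and-anomaly logic generalizes: profile first to understand the shape of the data, then codify the tolerances as DQ rules.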

Nice To Haves

  • Exposure to PowerBI for data visualization and reporting is a plus.

Responsibilities

  • Operate as the Federated Data Officer for ISDAD.
  • Analyze data sources, data dictionaries, business glossaries, data mappings, ETL processes, and data within CyberDW to understand their structure, relationships, and dependencies, and to quantify and document the data quality rules, controls, and lineage around the key data elements (KDEs).
  • Partner with the data owners and stakeholders to create KDEs and DQ rules needed for CyberDW data.
  • Partner with the Data Owners and Data Stewards to understand what critical metrics and data fields are needed for Metric Dashboards.
  • Create technical requirements for the implementation and maintenance of DQ rules in the Collibra Data Quality (CDQ) platform and KDEs in the Collibra Data Intelligence platform.
  • Partner with Technology and DQ Managers to ensure KDEs are governed by DQ rules across the DQ dimensions of accuracy, completeness, uniqueness, timeliness, validity, and consistency.
  • Support reporting on data quality metrics and trends.
  • Perform data validations and reconciliations to assess the validity and effectiveness of the DQ rules; conduct QA and UAT; and create and maintain detailed test plans, test cases, and test scripts to validate CDQ results.
  • Establish QA governance frameworks with clear accountability, traceability, and reporting across all test cycles.
  • Lead the QA testing effort, designing and executing test cases for CyberDW.
  • Identify, report, and track defects as well as participate in creating and maintaining regression test plans, cases, and scripts using Xray in Jira.
  • Design, develop, and test CyberDW ETL feeds focusing on data ingestion and data quality.
  • Create scripts and test plans to ensure that the raw data is transformed, processed, and stored correctly.
  • Work with the development teams and participate in daily scrum calls adhering to an agile SDLC framework.
  • Drive the adoption of test management tools to ensure complete coverage and defect traceability.
  • Leverage modern test harnesses, CI/CD pipelines, and AI-driven test frameworks for smarter, faster, and more reliable validation.
  • Utilize production job scheduling systems for the DQ and QC processes and adhere to standards around the daily scheduling and batch monitoring of production jobs.
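As an illustration of the DQ-dimension checks the responsibilities above describe, the following is a minimal Python sketch, not taken from the posting, of completeness, uniqueness, and validity checks over record sets. The `incidents` records and the `incident_id` pattern are hypothetical; in practice such rules would be defined and scheduled in CDQ rather than hand-rolled.

```python
import re

def check_completeness(records, field):
    """Share of records where `field` is present and non-empty."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def check_uniqueness(records, field):
    """True if no two records share the same value for `field`."""
    values = [r.get(field) for r in records]
    return len(values) == len(set(values))

def check_validity(records, field, pattern):
    """Share of records whose `field` fully matches the regex `pattern`."""
    ok = sum(1 for r in records if r.get(field) and re.fullmatch(pattern, r[field]))
    return ok / len(records)

# Hypothetical incident records, purely for illustration.
incidents = [
    {"incident_id": "INC-001", "severity": "high"},
    {"incident_id": "INC-002", "severity": "low"},
    {"incident_id": "INC-002", "severity": ""},
]

print(check_completeness(incidents, "severity"))           # one empty severity
print(check_uniqueness(incidents, "incident_id"))          # duplicate id -> False
print(check_validity(incidents, "incident_id", r"INC-\d{3}"))
```

Each check maps to one DQ dimension (completeness, uniqueness, validity); accuracy, timeliness, and consistency typically require reconciliation against a reference source or a schedule rather than a single-column rule.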