Data Engineer II – Qlik, DBT, Snowflake

First Citizens Bank – Raleigh, NC
Remote

About The Position

This is a remote role that may be filled only in the following locations: NC, TX, AZ. The position is responsible for designing, developing, and maintaining data pipelines and infrastructure. The role implements data integration, processing, and analytics solutions in an agile environment while collaborating with cross-functional teams to deliver efficient and scalable data systems.

Requirements

  • Bachelor's Degree and 2 years of experience in data engineering, database management, or a related field, OR High School Diploma or GED and 6 years of experience in data engineering, database management, or a related field
  • Hands-on experience building robust, metadata-driven, automated data pipeline solutions leveraging modern cloud-based data technologies and tools for large data platforms.
  • Hands-on experience applying data security and governance methodologies to meet data compliance requirements.
  • Experience building automated ELT data pipelines and Snowpipe frameworks using Qlik Replicate, DBT Cloud, and Snowflake with CI/CD.
  • Hands-on experience building data integrity solutions across multiple data sources and targets, such as SQL Server, Oracle, Mainframe DB2, files, and Snowflake.
  • Experience working with various structured and semi-structured data files: CSV, fixed-width, JSON, XML, Excel, and mainframe VSAM.
  • Experience using AWS services such as S3, Lambda, SQS, SNS, Glue, and RDS.
  • Proficiency in Python, PySpark, and advanced SQL for ingestion frameworks and automation.
  • Hands-on data orchestration experience using DBT Cloud and Astronomer Airflow.
  • Experience implementing logging, monitoring, alerting, observability, and performance-tuning techniques.
  • Experience implementing and maintaining sensitive-data protection strategies: tokenization, Snowflake data masking policies, dynamic and conditional masking, and role-based masking rules.
  • Strong experience designing and implementing RBAC and data access controls, and adopting governance standards across Snowflake and supporting systems.
  • Strong experience adopting release management guidelines, deploying code to multiple environments, implementing disaster recovery strategies, and leading production activities.
  • Experience implementing schema drift detection and schema evolution patterns.
  • Must have one or more certifications in the relevant technology fields.
  • Team Player: Support peers, team, and department management.
  • Communication: Excellent verbal, written, and interpersonal communication skills.
  • Problem Solving: Excellent problem-solving skills, incident management, root cause analysis, and proactive solutions to improve quality.
  • Partnership and Collaboration: Develop and maintain partnerships with business and IT stakeholders.
  • Attention to Detail: Ensure accuracy and thoroughness in all tasks.
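The requirements above call out schema drift detection between sources and targets. As an illustrative sketch only (not the bank's actual tooling), assuming column metadata has already been pulled from each system, e.g. via an INFORMATION_SCHEMA query, a minimal drift check could look like this:

```python
# Minimal schema-drift check: compare {column_name: data_type} mappings
# fetched from a source system and a Snowflake target (fetching is assumed
# to have happened already; these dicts are hypothetical sample metadata).

def detect_schema_drift(source_columns, target_columns):
    """Classify drift between two {column_name: data_type} mappings."""
    source = {k.upper(): v.upper() for k, v in source_columns.items()}
    target = {k.upper(): v.upper() for k, v in target_columns.items()}
    return {
        "added": sorted(set(source) - set(target)),       # new in source, missing in target
        "removed": sorted(set(target) - set(source)),     # dropped from the source
        "type_changed": sorted(
            c for c in set(source) & set(target) if source[c] != target[c]
        ),
    }

drift = detect_schema_drift(
    {"id": "NUMBER", "name": "VARCHAR", "created_at": "TIMESTAMP"},
    {"id": "NUMBER", "name": "NUMBER"},
)
print(drift)  # {'added': ['CREATED_AT'], 'removed': [], 'type_changed': ['NAME']}
```

In a metadata-driven pipeline of the kind described, a non-empty result would typically trigger an alert or an automated ALTER, depending on the evolution policy.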

Nice To Haves

  • Financial services or banking experience.

Responsibilities

  • Data Pipelines - Design, develop, and maintain data pipelines for efficient real-time and batch data processing and integration. Implement and optimize ELT processes to extract, load, transform, and integrate data from various sources. Enhance data flows and storage solutions for improved performance.
  • Data Warehousing - Design and implement data warehouse structures. Ensure data quality and consistency within the data warehouse. Apply data modeling techniques for efficient data storage and retrieval.
  • Data Governance, Quality & Compliance - Implement data governance policies and procedures. Implement data quality frameworks, standards, and documentation. Ensure compliance with relevant data regulations and standards.
  • Security & Access Management - Implement data security measures and access controls. Maintain data protection protocols.
  • Performance Optimization and Troubleshooting - Analyze and optimize system performance for large-scale data operations. Troubleshoot data issues and implement robust solutions.
  • Testing & Automation - Write unit test cases, validate data integrity and consistency requirements, and automate data pipelines using GitLab, GitHub, and CI/CD tools.
  • Code Deployment & Release Management - Adopt release management processes to promote code deployments across environments, including production, and support disaster recovery activities.
  • Cross-Functional Collaboration - Collaborate with data scientists, analysts, and other teams to understand and meet data requirements. Participate in cross-functional projects to support data-driven initiatives.
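The security and access-management responsibilities above center on role-based dynamic masking. In Snowflake itself this is a SQL MASKING POLICY object, but the conditional logic it encodes can be sketched in Python for illustration (role names and the masking format here are hypothetical, not taken from the posting):

```python
# Illustrative sketch of a role-based dynamic masking rule, mirroring the
# logic of a Snowflake masking policy: privileged roles see the raw value,
# everyone else sees a partial mask. PRIVILEGED_ROLES is an assumed set.

PRIVILEGED_ROLES = {"SECURITY_ADMIN", "COMPLIANCE_ANALYST"}

def mask_ssn(value: str, current_role: str) -> str:
    """Return the raw SSN for privileged roles, a partial mask otherwise."""
    if current_role.upper() in PRIVILEGED_ROLES:
        return value
    # Keep only the last four digits, a common partial-masking convention.
    return "***-**-" + value[-4:]

print(mask_ssn("123-45-6789", "ANALYST"))             # ***-**-6789
print(mask_ssn("123-45-6789", "COMPLIANCE_ANALYST"))  # 123-45-6789
```

The equivalent Snowflake policy would branch on CURRENT_ROLE() inside a CASE expression and be attached to the column with ALTER TABLE ... SET MASKING POLICY.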

Benefits

  • Benefits are an integral part of total rewards and First Citizens Bank is committed to providing a competitive, thoughtfully designed and quality benefits program to meet the needs of our associates. More information can be found at https://jobs.firstcitizens.com/benefits.