Principal Data Engineer

Fidelity Investments
Durham, NC (Hybrid)

About The Position

Position Description: Designs and delivers data lakes, data warehouses, and reporting platforms. Develops data and analytics solutions on the Snowflake Cloud Platform and provides technical guidance in the implementation and practice of relational database technologies and tools, including Snowflake, Oracle, SQL Server, and PL/SQL. Works with teams to support multiple source databases and builds extract-replication metrics using replication and programming tools. Builds automated pipelines that deploy to various environments and services using Amazon Web Services (AWS), Python, Concourse, Jenkins core, and Groovy scripting.
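
As a rough illustration of the Snowflake-plus-Python work described above, the sketch below connects to Snowflake from Python and runs a simple data-quality query. This is a minimal example assuming the widely used snowflake-connector-python package; the account, warehouse, database, and table names are placeholders, not details from this posting.

    # Minimal sketch: connect to Snowflake and run a basic freshness check.
    # All identifiers below are hypothetical placeholders.
    import os

    import snowflake.connector  # pip install snowflake-connector-python

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # hypothetical warehouse
        database="DATA_LAKE",      # hypothetical database
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # Row count for today's load as a simple data-quality metric.
        cur.execute("SELECT COUNT(*) FROM trades WHERE load_date = CURRENT_DATE()")
        print("Rows loaded today:", cur.fetchone()[0])
    finally:
        conn.close()

In practice, credentials would come from a secrets manager rather than environment variables.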

Requirements

  • Bachelor’s degree in Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and five (5) years of experience as a Principal Data Engineer (or a closely related occupation) designing and building databases and data models in a financial services environment.
  • Alternatively, a Master’s degree in Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and three (3) years of experience as a Principal Data Engineer (or a closely related occupation) designing and building databases and data models in a financial services environment.
  • Demonstrated Expertise (“DE”) translating business requirements into technical validations, examining data to determine accuracy, quality, or condition using SQL, Python, Java, and Scala; and automating Continuous Integration and Continuous Delivery (CI/CD) pipelines for REST API deployments using Stash, GitHub, Jenkins, and uDeploy.
  • DE developing data ingestion frameworks in Python to load structured data from relational databases into a cloud Software-as-a-Service (SaaS) data lake platform (Snowflake), using AWS, NiFi, Java, Snowflake, and Python.
  • DE developing data masking, reconciliation frameworks, and comparison dashboard metrics using big data technologies, Python, Java, Snowflake, and Airflow (see the Airflow sketch following this list).
  • DE building metadata-driven frameworks for file-feed data ingestion into Snowflake using AWS, Python, Java, Snowflake, and Airflow (a minimal loader sketch follows this list).
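
As context for the metadata-driven ingestion bullet above, here is a minimal sketch of what such a loader can look like with the Snowflake Python connector. The feed metadata schema, stage, table, and file names are invented for illustration; a production framework would add validation, auditing, and error handling.

    import snowflake.connector

    # Hypothetical per-feed metadata; a real framework would read this from
    # a configuration store or metadata tables rather than a literal dict.
    FEED_METADATA = {
        "trades_daily": {
            "local_path": "/data/feeds/trades_daily.csv",
            "stage": "@RAW.FEED_STAGE",
            "target_table": "RAW.TRADES_DAILY",
            "file_format": "(TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1)",
        },
    }

    def load_feed(conn: snowflake.connector.SnowflakeConnection, feed_name: str) -> None:
        """Stage a feed file and COPY it into its target table, driven by metadata."""
        meta = FEED_METADATA[feed_name]
        cur = conn.cursor()
        # PUT uploads the local file into an internal Snowflake stage.
        cur.execute(f"PUT file://{meta['local_path']} {meta['stage']} OVERWRITE = TRUE")
        # COPY INTO loads the staged file into the target table.
        cur.execute(
            f"COPY INTO {meta['target_table']} FROM {meta['stage']} "
            f"FILE_FORMAT = {meta['file_format']}"
        )

Because every feed is described by data rather than code, onboarding a new file feed becomes a configuration change instead of a new pipeline.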
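
Likewise, reconciliation work of the kind mentioned above is often orchestrated with Airflow. The sketch below compares a source row count against its Snowflake copy in a daily DAG; the connection IDs, table names, and single-metric comparison are assumptions for illustration, not the framework the posting refers to.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.providers.oracle.hooks.oracle import OracleHook
    from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

    def reconcile_row_counts():
        """Compare a source table's row count against its Snowflake copy."""
        # Connection IDs and table names are hypothetical placeholders.
        source = OracleHook(oracle_conn_id="oracle_source").get_first(
            "SELECT COUNT(*) FROM trades"
        )[0]
        target = SnowflakeHook(snowflake_conn_id="snowflake_dw").get_first(
            "SELECT COUNT(*) FROM raw.trades"
        )[0]
        if source != target:
            raise ValueError(f"Reconciliation failed: source={source}, target={target}")

    with DAG(
        dag_id="trades_reconciliation",  # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="reconcile_row_counts",
            python_callable=reconcile_row_counts,
        )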

Responsibilities

  • Designs, implements, and maintains data structures, batch jobs, and interfaces to external systems.
  • Develops original and creative technical solutions to ongoing development efforts.
  • Develops applications for multiple projects supporting several divisional initiatives.
  • Supports and performs all phases of testing leading to implementation.
  • Assists in the planning and conducting of user acceptance testing.
  • Develops comprehensive documentation for multiple applications supporting several corporate initiatives.
  • Performs post-installation validation and triages any issues.
  • Establishes project plans for projects of moderate scope.
  • Performs independent and complex technical and functional analysis for multiple projects supporting several initiatives.
  • Manages data services hosted on the operational data stores and file-based interfaces.
  • Confers with systems analysts and other software engineers/developers to design systems.
  • Gathers information on project limitations and capabilities, performance requirements, and interfaces.
  • Develops and oversees software system testing and validation procedures, programming, and documentation.