Principal Data Engineer

Fidelity Investments | Westlake, TX | Hybrid

About The Position

Position Description: Develops and deploys pipelines using DevOps and Continuous Integration/Continuous Delivery (CI/CD) best practices within a cloud-native infrastructure. Provides data analysis on complex systems analysis projects, often spanning subsystems and companies, within a matrix organization. Develops Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) pipelines to move data to and from the Snowflake data store using Python and Snowflake SnowSQL. Establishes CD pipelines to deploy tools and practices using GitHub, Jenkins, Stash, and Artifactory. Supports the creation, maintenance, and compliance of Agile/SCRUM development standards and guidelines. Performs data manipulation using Amazon Web Services (AWS). Performs data mining and data analysis using Oracle, SQL Server, and NoSQL databases.
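The ETL/ELT work the description names can be sketched minimally in Python. The source rows, transformation rule, and dict-backed "warehouse" below are hypothetical stand-ins; a real pipeline for this role would read from and write to Snowflake (for example via the snowflake-connector-python library) rather than in-memory structures.

```python
# Minimal ETL sketch: extract records, transform them, load into a target store.
# All data and the dict-backed "warehouse" are invented for illustration.

def extract(source_rows):
    """Pull raw records from a source system (stubbed as a list of dicts)."""
    return list(source_rows)

def transform(rows):
    """Normalize field names and derive an order_total column."""
    return [
        {
            "customer_id": r["cust"],
            "order_total": r["qty"] * r["unit_price"],
        }
        for r in rows
    ]

def load(rows, warehouse):
    """Append transformed rows to the target table (a dict standing in for Snowflake)."""
    warehouse.setdefault("orders", []).extend(rows)
    return len(rows)

raw = [
    {"cust": "C1", "qty": 2, "unit_price": 9.5},
    {"cust": "C2", "qty": 1, "unit_price": 20.0},
]
warehouse = {}
loaded = load(transform(extract(raw)), warehouse)
print(loaded)                                  # 2
print(warehouse["orders"][0]["order_total"])   # 19.0
```

An ELT variant of the same sketch would simply call `load` before `transform`, pushing raw rows into the store and transforming them there, which is the pattern SnowSQL-based transformations follow.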

Requirements

  • Bachelor’s degree in Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and five (5) years of experience as a Principal Data Engineer (or closely related occupation) analyzing, designing, and building ETL processes, and using data integration solutions to ensure reliable and scalable data management across operational or analytical capability platforms, using Snowflake, Kafka, and AWS.
  • Alternatively, Master’s degree in Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and three (3) years of experience as a Principal Data Engineer (or closely related occupation) analyzing, designing, and building ETL processes, and using data integration solutions to ensure reliable and scalable data management across operational or analytical capability platforms, using Snowflake, Kafka, and AWS.
  • Demonstrated Expertise (“DE”) architecting, designing, and developing microservices-based Application Programming Interfaces (APIs) and testing automation frameworks, using Python, Java, Swagger, Amazon Elastic Kubernetes Service (EKS), and serverless technologies.
  • DE developing CI/CD pipelines in a hybrid on-prem and Cloud environment (AWS) to deliver changes in production and non-production environments, using DevOps tools (GitHub, Jenkins, Maven, and Terraform).
  • DE analyzing, designing, developing, and testing ETL batch processing applications for data warehouse and Online Transaction Processing (OLTP) based systems, using AWS, Snowflake, Oracle, and PostgreSQL PL/SQL.
  • DE performing logical and physical data modeling for relational databases (SQL Server, Postgres, Oracle, and Snowflake), and optimizing database and query performance by implementing appropriate data types, indexing strategies, and partitioning techniques based on data access patterns.
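The partitioning point in the last requirement can be illustrated with a small sketch: rows bucketed by month so a date-bounded query reads only the partition it needs instead of scanning the whole table. The schema and rows are invented; in Snowflake this would be clustering keys, and in Postgres declarative partitions.

```python
# Sketch of partition pruning: bucket rows by month so a month-scoped query
# touches a single partition. Table contents are invented for illustration.
from collections import defaultdict

def partition_by_month(rows):
    """Group rows into partitions keyed by year-month, e.g. "2024-03"."""
    parts = defaultdict(list)
    for r in rows:
        parts[r["order_date"][:7]].append(r)
    return parts

def query_month(parts, month):
    """Read one partition rather than scanning every row."""
    return parts.get(month, [])

rows = [
    {"order_date": "2024-01-15", "amount": 10},
    {"order_date": "2024-03-02", "amount": 25},
    {"order_date": "2024-03-20", "amount": 5},
]
parts = partition_by_month(rows)
march = query_month(parts, "2024-03")
print(len(parts))                        # 2 partitions
print(sum(r["amount"] for r in march))   # 30
```

Choosing the partition key from the dominant access pattern (here, queries filtered by month) is the design decision the requirement describes; a key that queries rarely filter on would leave every partition in play and prune nothing.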

Responsibilities

  • Designs, implements, and maintains data structures, batch jobs, and interfaces to external systems.
  • Develops original and creative technical solutions to ongoing development efforts.
  • Develops applications for multiple projects supporting several divisional initiatives.
  • Supports and performs all phases of testing leading to implementation.
  • Assists in the planning and conducting of user acceptance testing.
  • Develops comprehensive documentation for multiple applications supporting several corporate initiatives.
  • Responsible for post-installation validation and triaging of any issues.
  • Establishes project plans for projects of moderate scope.
  • Performs independent and complex technical and functional analysis for multiple projects supporting several initiatives.
  • Manages data services hosted on the operational data stores and file-based interfaces.
  • Confers with systems analysts and other software engineers/developers to design systems.
  • Gathers information on project limitations and capabilities, performance requirements, and interfaces.
  • Develops and oversees software system testing and validation procedures, programming, and documentation.
© 2024 Teal Labs, Inc