Fidelity Investments • posted about 1 month ago
Full-time • Mid Level
Hybrid • Westlake, TX
5,001-10,000 employees
Securities, Commodity Contracts, and Other Financial Investments and Related Activities

Designs, implements, and maintains the organization's database systems. Ensures the scalability, reliability, performance, and security of databases through database tuning and optimization techniques. Incorporates and streamlines test automation into software application builds, using Continuous Integration/Continuous Delivery (CI/CD) pipeline tools -- Jenkins and Maven. Writes and optimizes SQL queries and stored procedures for database development, using SQL, PL/SQL, Python, and shell scripting. Develops applications in a Cloud environment using AWS. Builds reports and dashboards in Oracle Business Intelligence tools and PowerBI. Creates and maintains database schemas and objects (tables, views, indexes, and triggers) for efficient access to database systems -- Oracle and Snowflake. Enhances the data integration platform using Extract, Transform, Load (ETL) technologies -- Informatica and SnapLogic. Implements and supports databases in an Amazon Web Services (AWS) Cloud Infrastructure environment and in AWS-managed databases.

  • Monitors and optimizes database performance by implementing various database designs and modelling techniques.
  • Designs and implements database systems to support business requirements.
  • Translates the vision for divisional initiatives into business solutions by developing complex or multiple software applications.
  • Defines and implements application-level architecture.
  • Develops comprehensive documentation for multiple applications or subsystems.
  • Establishes full project life cycle plans for complex projects across multiple platforms.
  • Advises senior management on technical strategy and risk management strategies for projects.
  • Mentors junior team members.
  • Maintains the performance, availability, and security of the databases.
  • Troubleshoots and resolves issues with databases and related systems by continuously monitoring and optimizing database performance.
  • Confers with data processing or project managers to obtain information on limitations or capabilities for data processing projects.
  • Confers with systems analysts and other software engineers/developers to design systems and to obtain information on project limitations and capabilities, performance requirements, and interfaces.
  • Develops software system testing and validation procedures, programming, and documentation.
  • Bachelor's degree in Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and five (5) years of experience as a Principal Data Engineer (or closely related occupation) developing and supporting data integration and reporting platforms in a financial services environment.
  • Or, alternatively, Master's degree in Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and three (3) years of experience as a Principal Data Engineer (or closely related occupation) developing and supporting data integration and reporting platforms in a financial services environment.
  • Demonstrated Expertise ("DE") implementing Cloud technologies using Oracle Analytics Cloud (OAC); and delivering real-time reports with actionable insights to stakeholders using advanced BI tools (Oracle Analytics Server (OAS), Oracle Business Intelligence (OBIEE), and PowerBI) to create dynamic, user-driven reports for multidimensional data analysis and online analytical processing (OLAP).
  • DE performing data engineering, migration, and modelling in a Cloud-based data warehouse platform, using Snowflake, Python, SnapLogic, Apache Airflow, and AWS S3 buckets (staging layer); and developing code to process large datasets in relational environments and improving data workflows and data availability for business operations, using Informatica and PL/SQL.
  • DE designing and developing data pipelines for data warehouses (including extract, transform, load (ETL) from VSAM and DB2), using Informatica and zNFS in Agile frameworks (Scrum and Kanban) within a financial services domain; integrating data pipelines with external PL/SQL and UNIX shell scripting; creating and managing batch jobs, using Control-M and Informatica servers; and identifying bottlenecks in ETL packages and providing the best-suited solutions.
  • DE applying DevOps principles throughout the Software Development Life Cycle (SDLC); building and promoting CI/CD pipelines to automate deployments, using Jenkins, Jenkins Core, Stash, Bitbucket, and GitHub; and improving system reliability and supporting continuous delivery of data engineering solutions, using Cloud platforms and deployment tools (AWS, Azure, IBM uDeploy, and Ansible).
© 2024 Teal Labs, Inc