Fidelity Investments · posted about 1 month ago
Full-time • Mid Level
Hybrid • Westlake, TX
5,001-10,000 employees
Securities, Commodity Contracts, and Other Financial Investments and Related Activities

Produces scalable, resilient, cloud-based system designs using multiple methodologies, including data warehousing, data visualization, and data integration. Simplifies Online Transaction Processing (OLTP) using relational database technologies (Oracle SQL and PL/SQL) and Snowflake. Constructs and compares models using data modeling tools and data ingestion tool sets, including Apache NiFi and Kafka. Sets up reliable infrastructure for data-related tasks, particularly streaming analytics with Kafka. Writes SQL queries in Oracle/Snowflake and optimizes query performance for large datasets. Develops ETL/ELT pipelines that move data to and from the Snowflake data store using Python, AWS, and Snowflake. Works closely with business units and architects to gather requirements and to plan, design, develop, and deploy on-premises and cloud-based applications. Designs, develops, and modifies software systems, using scientific analysis and mathematical models to predict and measure the outcomes and consequences of design.

  • Determines system performance standards.
  • Assists in establishing and maintaining industry standards in systems and security.
  • Monitors functioning of equipment to ensure system operates in conformance with specifications.
  • Crafts and implements operational data stores and data lakes in a production environment.
  • Analyzes information to determine, recommend, and plan installation of a new system or modification of an existing system.
  • Confers with systems analysts and other software engineers/developers to design systems and obtain information on project limitations and capabilities, performance requirements, and interfaces.
  • Develops and maintains software system testing and validation procedures, programming, and documentation.
  • Analyzes business requirements and delineates possible roadmaps and milestone plans to achieve the desired strategic initiatives.
  • Collaborates with various business units to fulfill their needs and those of their customers; provides technical support such as application and framework development and data management solution design and implementation.
  • Develops ingestion and transformation frameworks to establish data pipelines that collect logs and expose metadata to the consumption layer.
  • Provides exploratory analysis and framework development, product development and enhancements, and platform and infrastructure solutions and support.
  • Delivers actionable insights to various business units through data convergence and lowers the cost of innovation.
  • Bachelor's degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Information Science, Mathematics, Physics, or a closely related field and five (5) years of experience as a Principal Data Engineer (or closely related occupation) developing legal, compliance, risk, and security applications using AWS Cloud Platforms, Python, and Confluent Kafka to support analytical platforms in a financial services environment.
  • Or, alternatively, Master's degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Information Science, Mathematics, Physics, or a closely related field and three (3) years of experience as a Principal Data Engineer (or closely related occupation) developing legal, compliance, risk, and security applications using AWS Cloud Platforms, Python, and Confluent Kafka to support the analytical platforms in a financial services environment.
  • Demonstrated Expertise ("DE") building data pipelines for feature building that enable the detection, prevention, and surveillance of financial crimes, using the AWS Cloud Platform, Snowflake, Python, and Confluent Kafka.
  • DE designing, building, and deploying AI/ML model engineering pipelines using AWS Kubernetes, Jenkins Core, Stash, Artifactory, ModelOp, and Docker to support multiple legal, risk, and compliance models.
  • DE implementing data components (databases, tables, views, types, stored procedures, functions, roles, and queries) for legal, compliance, risk, and security applications on relational databases (DB2, Oracle, and Sybase), using SQL and PL/SQL queries.
  • DE designing and developing event-based system integration frameworks using AWS Lambda, Confluent Kafka, SQS, Control-M, and Spring Schedulers for legal, compliance, risk, and security applications.
© 2024 Teal Labs, Inc