Senior Data Engineer

Fidelity Investments, Westlake, TX
Hybrid

About The Position

Position Description: Develops and implements data management and reporting services tools, platforms, and enhancements. Participates in designing and building modern Cloud platforms to support company products for global markets. Creates, develops, and maintains metrics, reports, dashboards, and Business Intelligence (BI) solutions. Builds scalable patterns for data consumption from Cloud-based data lakes by leveraging Cloud technologies and DevOps concepts, including Continuous Integration/Continuous Delivery (CI/CD) pipelines. Develops and maintains comprehensive reporting solutions that provide actionable insights and support data-driven decision-making for the team and stakeholders.

Requirements

  • Bachelor’s degree in Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and three (3) years of experience as a Senior Data Engineer (or closely related occupation) building financial and Medicare data warehouse applications using data modeling and Extract, Transform, and Load (ETL) processing.
  • Alternatively, a Master’s degree in Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and one (1) year of experience as a Senior Data Engineer (or closely related occupation) building financial and Medicare data warehouse applications using data modeling and Extract, Transform, and Load (ETL) processing.
  • Demonstrated Expertise (“DE”) implementing ETL and Data Integration (DataStage with UNIX); designing and optimizing ETL Workflows, using IBM DataStage, Mainframe files, databases (DB2 and Snowflake), and Flat files; automating and integrating tasks, using UNIX Shell scripting with DataStage jobs; implementing effective data transformation and performance tuning, using DataStage and UNIX; and orchestrating multistage data pipelines to ensure data accuracy and automated data profiling, using DataStage and UNIX.
  • DE designing and implementing data workflows, using Apache Airflow (migrating from TWS and Control-M to improve scaling and automation); coordinating the execution of DataStage and Snowflake jobs, using UNIX scripts; and automating Snowflake features (Snowpipe and stored procedures) and integrating them into workflows with task parallelization and dependency tracking, using Directed Acyclic Graphs (DAGs) to ensure audit logging (a minimal Airflow sketch follows this list).
  • DE implementing Cloud Data Warehousing to store, process, and analyze data, using Snowflake; designing complex SQL and PL/SQL, and optimizing complex data transformations, analytics, and aggregations, using Common Table Expressions (CTEs), pivoting, and window functions (to handle data processing); developing data ingestion pipelines, using Snowflake with Airflow and ETL scripts; and implementing Change Data Capture (CDC) strategies for incremental loads and optimizing bulk data ingestion on Cloud services (Azure and AWS) (see the CDC sketch after this list).
  • DE developing and implementing business analytics and development strategies on FinTech and Medicare data to visualize trends and identify insights, using Power BI and Tableau; conducting Cognos and MicroStrategy (MSTR) report validations, using Cognos Framework Models and MicroStrategy; verifying report data accuracy with SQL on database systems (Netezza, Snowflake, Oracle, and DB2); and executing ETL testing for data accuracy, completeness, and transformation logic, using SQL (see the reconciliation sketch after this list).
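
The Airflow requirement above describes DAGs that coordinate DataStage and Snowflake jobs with task parallelization and dependency tracking. The following is a minimal sketch of what such a DAG can look like, assuming Airflow 2.4+ with the Snowflake provider installed; the DAG id, script path, connection ID, and stored procedure name are hypothetical placeholders, not details from this posting.

    # Minimal Airflow DAG sketch: a UNIX wrapper script kicks off a DataStage
    # job, then a Snowflake stored procedure runs once it succeeds.
    # All names (DAG id, script path, connection id, procedure) are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

    with DAG(
        dag_id="datastage_to_snowflake",
        start_date=datetime(2024, 1, 1),
        schedule="0 2 * * *",  # nightly at 02:00
        catchup=False,
    ) as dag:
        # UNIX shell wrapper around the DataStage job; a non-zero exit code
        # fails the task. The trailing space stops Jinja from treating the
        # ".sh" path as a template file.
        run_datastage = BashOperator(
            task_id="run_datastage_extract",
            bash_command="/opt/etl/bin/run_datastage_extract.sh ",
        )

        # Snowflake stored procedure that loads the staged data downstream.
        load_snowflake = SnowflakeOperator(
            task_id="call_load_procedure",
            snowflake_conn_id="snowflake_default",
            sql="CALL load_claims_fact();",
        )

        # Dependency tracking: the load only runs after DataStage succeeds.
        run_datastage >> load_snowflake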
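
The Snowflake requirement pairs CTEs and window functions with CDC-style incremental loads. Here is a hedged sketch using the snowflake-connector-python package: a CTE with ROW_NUMBER() keeps only the latest change per key from a staging table, and a MERGE applies it to the target. All table, column, and connection values are illustrative assumptions.

    # Sketch of a CDC incremental load into Snowflake; names are hypothetical.
    import snowflake.connector

    MERGE_SQL = """
    MERGE INTO orders AS tgt
    USING (
        -- CTE + window function: keep the most recent change per order_id
        WITH ranked AS (
            SELECT *,
                   ROW_NUMBER() OVER (
                       PARTITION BY order_id
                       ORDER BY change_ts DESC
                   ) AS rn
            FROM orders_stage
        )
        SELECT order_id, status, amount, change_ts FROM ranked WHERE rn = 1
    ) AS src
    ON tgt.order_id = src.order_id
    WHEN MATCHED THEN UPDATE SET
        tgt.status = src.status,
        tgt.amount = src.amount,
        tgt.change_ts = src.change_ts
    WHEN NOT MATCHED THEN INSERT (order_id, status, amount, change_ts)
        VALUES (src.order_id, src.status, src.amount, src.change_ts);
    """

    conn = snowflake.connector.connect(
        account="my_account",  # hypothetical account and credentials
        user="etl_user",
        password="...",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        conn.cursor().execute(MERGE_SQL)
    finally:
        conn.close()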
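
The validation requirement amounts to reconciling report figures against the underlying databases with SQL. A small sketch of that idea, assuming two generic DB-API connections (for example, a Snowflake source and an Oracle reporting database); the table names, the amount column, and the metric set are hypothetical.

    # Hypothetical source-vs-target reconciliation for ETL/report validation.
    CHECKS = {
        "row_count": "SELECT COUNT(*) FROM {table}",
        "total_amount": "SELECT COALESCE(SUM(amount), 0) FROM {table}",
    }

    def fetch_metric(conn, sql_template, table):
        """Run one scalar check against a DB-API connection."""
        cur = conn.cursor()
        cur.execute(sql_template.format(table=table))
        return cur.fetchone()[0]

    def reconcile(src_conn, tgt_conn, src_table, tgt_table):
        """Return True when every metric matches between source and target."""
        ok = True
        for name, sql in CHECKS.items():
            src_val = fetch_metric(src_conn, sql, src_table)
            tgt_val = fetch_metric(tgt_conn, sql, tgt_table)
            if src_val != tgt_val:
                print(f"MISMATCH {name}: source={src_val} target={tgt_val}")
                ok = False
        return ok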

Responsibilities

  • Identifies opportunities for new development within a scalable public Cloud environment.
  • Collaborates and partners with product owners and development teams to translate business requirements into actionable tasks, ensuring cross-functional teams are informed throughout the project lifecycle.
  • Works closely with business partners and other system partners, serving as a developer on new tools and implementation projects.
  • Analyzes and manipulates datasets aligned with business requirements to ensure data integrity, accessibility, and security throughout the entire data lifecycle.
  • Applies data engineering, data warehousing, and analytics technologies to data application development, data integration, and data pipeline design patterns on a distributed platform.
  • Collaborates with development teams to integrate automated testing into the sprint cycle.
  • Performs continuous testing and validation of new features and functionalities.
  • Analyzes complex requirements, collaborates with developers to design efficient solutions, and creates prototypes to validate proposed solutions and mitigate technical risks.