Software Engineer [Multiple Positions Available]

JPMorgan Chase & Co., Jersey City, NJ

About The Position

Duties: Ensure all production jobs run within the Service Level Agreement (SLA) every day. Support major platform outages and upgrades, bringing applications back up to date by catching up all batch jobs. Perform issue triage across multiple platforms and application systems, including production code deployment with multilayer, robust review under the change advisory board. Perform data catch-up when issues occur so that downstream applications are not impacted. Perform code reviews and support production issue resolution, working with direct users and power users. Automate metadata consolidation where metadata is stored on various platforms, loading the data from different systems into a single database. Create Tableau dashboards to capture the status of data movement from legacy platforms to the cloud, using various legacy tools. Review data reconciliation and data validation status by reading data from reconciliation platforms developed within the firm.
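The duties above include automating metadata consolidation so that metadata scattered across platforms lands in a single database. As a rough illustration only (the platform names, schema, and SQLite target are hypothetical, not the firm's actual stack), such a consolidation step might look like:

```python
import sqlite3

# Hypothetical per-platform metadata extracts; in practice these would come
# from each platform's catalog API or export files.
SOURCES = {
    "teradata": [{"dataset": "acct_dim", "owner": "finance"}],
    "oracle":   [{"dataset": "txn_fact", "owner": "payments"}],
}

def consolidate(sources, conn):
    """Load metadata rows from every source system into one table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS metadata "
        "(platform TEXT, dataset TEXT, owner TEXT)"
    )
    for platform, rows in sources.items():
        conn.executemany(
            "INSERT INTO metadata VALUES (?, ?, ?)",
            [(platform, r["dataset"], r["owner"]) for r in rows],
        )
    conn.commit()

conn = sqlite3.connect(":memory:")  # the single consolidated database
consolidate(SOURCES, conn)
count = conn.execute("SELECT COUNT(*) FROM metadata").fetchone()[0]
```

The key design point is tagging every row with its source platform, so lineage back to the originating system survives the consolidation.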

Requirements

  • Bachelor's degree in Computer Engineering, Software Engineering, Information Technology, or related field of study plus five (5) years of experience in the job offered or as Software Engineer, Technical Lead, Ab Initio/Software Developer, Systems Engineer, or related occupation.
  • working with all components of Ab Initio GDE, including rollup, join, scan, reformat, partitioning, MFS, sort, and dedup, using GDE versions 3.3 to 4.0
  • building, enhancing, conducting unit testing and performance review, and promoting the Ab Initio ETL code from DEV to higher environments including PROD while minimizing impacts to downstream testing and users
  • using Ab Initio GDE for graph analysis and working with document mapping
  • using performance optimization techniques including parallel processing, efficient data partitioning, and resource management for optimization
  • writing business rules in Ab Initio BRE to manage and execute complex decision logic and business rules independently from application code
  • using Informatica PowerCenter to perform ETL processing on large volumes of data with components including sorter, joiner, lookup, and expression transformations, and parsing non-relational data formats including XML and JSON
  • working with Informatica Repository Manager, PowerCenter Designer, Workflow Manager and Workflow Monitor
  • using Informatica BDM/DEI to handle large volumes of structured and unstructured data across on-premises and cloud environments with components including sorter, joiner, and expression transformations, and parsing non-relational data formats including XML and JSON
  • working with the Metadata Hub to maintain data quality, streamline data workflows, facilitate data discovery, and establish proper lineage for datasets from source to target
  • database management including Teradata and at least one of the following: Oracle or SQL Server
  • working on PL/SQL stored procedures and packages to load data from staging tables to target ETL tables
  • generating test data using the Create Data and Leading Records transformation in Ab Initio
  • identifying and resolving issues using log analysis tools and debugging techniques, including SQL modularization and script threading
  • using unit testing frameworks for performance tuning and performance review through SQL optimization
  • designing and developing automation scripts using Unix and Python to minimize recurring and time-consuming activities
  • using Unix commands and working with awk and sed
  • debugging shell scripts and enhancing them based on the requirement
  • using scheduling tools including Control-M, AutoSys, Control Center, CA-7 Mainframe Scheduling, and ESP to run jobs for data validation and UAT, monitor jobs, retrieve logs, and debug failures
  • working with the Snowflake platform and its processing capabilities on various data types and using the Snowflake Share feature
  • migrating code using tags and PIFs, and working with Air Commands
  • data modeling, including modifying existing physical and logical models, using at least one of the following tools: Erwin, IBM InfoSphere, MS Visio, or Oracle SQL Developer Data Modeler
  • participating in collaborative reviews and using checklists to ensure code quality using at least one of the following code review tools: Crucible or GitHub
  • translating business requirements into technical specifications using collaboration tools including MS Teams and Zoom for virtual meetings and documentation tools including Confluence and JIRA for capturing requirements and tracking progress
  • ensuring code quality with checklists and guidelines provided by Ab Initio, and following basic standards including MFS, layouts, choice of components, performance, and reusability
  • using techniques for query optimization and performance tuning to reduce the running time of Ab Initio code
  • using production deployment tools such as ServiceNow to create and submit change records and to facilitate the deployment of code to PROD
  • adhering to change management protocols, including Review Board evaluations, and approval processes such as database administration approvals
  • dashboard development using Tableau or at least one of the following Python libraries: Matplotlib or Plotly.
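Several of the requirements above involve monitoring scheduled jobs, retrieving logs, and debugging failures against an SLA. A minimal sketch of that kind of check, assuming a made-up log format and a hypothetical three-hour SLA window (nothing here reflects the firm's actual schedulers or tooling):

```python
from datetime import datetime, timedelta

# Hypothetical scheduler log lines: job name, start time, end time (HH:MM).
LOG_LINES = [
    "NIGHTLY_LOAD 01:00 02:10",
    "RECON_BATCH 02:15 05:40",
]

SLA = timedelta(hours=3)  # assumed per-job SLA window

def sla_breaches(lines, sla=SLA):
    """Return the names of jobs whose runtime exceeded the SLA."""
    breaches = []
    for line in lines:
        name, start, end = line.split()
        started = datetime.strptime(start, "%H:%M")
        ended = datetime.strptime(end, "%H:%M")
        if ended - started > sla:
            breaches.append(name)
    return breaches

breaches = sla_breaches(LOG_LINES)  # RECON_BATCH ran 3h25m, over the SLA
```

In practice the log lines would be retrieved from the scheduler (e.g. Control-M or AutoSys output), and a breach would trigger triage rather than just a list entry.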

Benefits

  • comprehensive health care coverage
  • on-site health and wellness centers
  • a retirement savings plan
  • backup childcare
  • tuition reimbursement
  • mental health support
  • financial coaching