Director of Software Engineering [Multiple Positions Available]

JPMorgan Chase & Co. • Wilmington, DE

About The Position

Duties: Lead the data engineering delivery roadmap for two products. Manage the migration of scrambled model training data and users from MTD to AWS, providing all required tools and utilities so that other forecasting model training users can operate directly on AWS independently. Spearhead the migration of card forecasting model serving to the cloud. Run an initiative to deliver Tableau reports at scale on the public cloud for stakeholders during CCAR cycles. Run an initiative to make all loss forecasting models cross-region resilient, operating in both AWS East and West regions.

The minimum education, experience, and skills for this role are listed in full under Requirements below.

Requirements

  • Bachelor's degree in Computer Engineering, Computer Science, Information Technology, Data Analytics, Data Engineering, or related field of study
  • Ten (10) years of experience in the job offered or as Director/Manager of Software Engineering, Software Engineer/Developer, Application Architect, Technology Analyst, or related occupation
  • Seven (7) years of experience with using Ab Initio, Unix scripts, SQL, and PL/SQL to develop and optimize ETL pipelines and real-time streaming solutions
  • Seven (7) years of experience with using SQL to design and optimize complex queries, manage Teradata and DB2 relational databases, and implement data models to support business intelligence and reporting needs
  • Seven (7) years of experience with designing and implementing comprehensive data architecture solutions while focusing on data modeling, integration, and governance to ensure data quality, accessibility, and scalability across the enterprise
  • Seven (7) years of experience with data integration, transformation, and loading to support enterprise data warehousing and analytics solutions
  • Seven (7) years of experience with implementing Agile, Waterfall, and Hybrid SDLC methodologies, including Scrum and Kanban, to enhance team collaboration, streamline project delivery, and adapt to changing requirements
  • Four (4) years of experience with using Apache Spark and Apache Kafka to develop and optimize data pipelines and real-time streaming solutions
  • Four (4) years of experience with using Java, J2EE, Python, and Scala programming to develop scalable and high-performance applications
  • Four (4) years of experience with utilizing frameworks such as Spring Boot and Spring to streamline development processes
  • Four (4) years of experience with designing and implementing RESTful APIs while focusing on scalable architecture, ensuring efficient data exchange using JSON, and ensuring seamless integration with client applications
  • Four (4) years of experience with using SQL to design and optimize complex queries, manage Teradata and DB2 relational databases, and implement data models to support business intelligence and reporting needs
  • Four (4) years of experience with deploying and managing containerized applications using Kubernetes, specifically configuring clusters and optimizing resource utilization for scalable and resilient cloud-native environments
  • Four (4) years of experience with designing and implementing comprehensive data architecture solutions while focusing on data modeling, integration, and governance to ensure data quality, accessibility, and scalability across the enterprise
  • Four (4) years of experience with data integration, transformation, and loading to support enterprise data warehousing and analytics solutions
  • Four (4) years of experience with using Apache Maven for project management and build automation, including dependency management, lifecycle management, and integration with continuous integration tools to streamline the development process
  • Four (4) years of experience with leveraging AWS Cloud Services, including EC2, S3, RDS, and Lambda, to design and implement scalable, secure, and cost-effective cloud solutions that meet diverse business needs
  • Four (4) years of experience with using MongoDB and HBase for developing and managing NoSQL databases, particularly focusing on schema design, indexing, and optimizing queries to support high-performance and scalable applications
  • Four (4) years of experience with using JUnit for unit testing Java applications, focusing on test-driven development (TDD) practices to ensure code quality, reliability, and maintainability

Responsibilities

  • Lead the data engineering delivery roadmap for two products.
  • Manage the migration of scrambled model training data and users from MTD to AWS, providing all required tools and utilities so that other forecasting model training users can operate directly on AWS independently.
  • Spearhead the migration of card forecasting model serving to the cloud.
  • Run an initiative to deliver Tableau reports at scale on the public cloud for stakeholders during CCAR cycles.
  • Run an initiative to make all loss forecasting models cross-region resilient, operating in both AWS East and West regions.

Benefits

  • Comprehensive health care coverage
  • On-site health and wellness centers
  • A retirement savings plan
  • Backup childcare
  • Tuition reimbursement
  • Mental health support
  • Financial coaching