Software Engineer [Multiple Positions Available]

JPMorgan Chase & Co., Plano, TX

About The Position

Duties: Design, develop, and implement software solutions. Solve business problems through innovation and engineering practices. Participate in all aspects of the Software Development Lifecycle (SDLC), including analyzing requirements, incorporating architectural standards into application design specifications, documenting application specifications, translating technical requirements into programmed application modules, and developing or enhancing software application modules. The full set of responsibilities, along with the minimum education, experience, and skill requirements, is itemized in the Requirements and Responsibilities sections below.

Requirements

  • Bachelor's degree in Electronics/Electrical Engineering, Computer Science/Engineering, Computer Information Systems, or related field of study plus 5 (five) years of experience in the job offered or as Software Engineer, Java Developer, Product Technical Lead, or related occupation.
  • Five (5) years of experience with the following: creating and implementing microservices architecture using Java and J2EE technologies
  • Leveraging Spring Boot for application setup, Spring Cloud for distributed systems, Spring Data for database interactions, Spring Security for authentication, Spring Integration for messaging, Spring Batch for batch processing, and Spring Kafka for stream processing (minimal sketches of a Spring Boot service and a Spring Kafka listener follow this list)
  • Incorporating Java concurrency with reactive programming and multithreading (see the lambda-based concurrency sketch after this list)
  • Hibernate for ORM
  • JVM optimization
  • Memory management
  • Dependency injection for efficient resource management and modular design
  • Coding using lambda expressions
  • Utilizing JSON, XSLT and XML formatted data files and service payloads in Java
  • Developing and deploying containerized APIs using Docker
  • Ensuring image management, high availability, and disaster recovery while configuring Kubernetes secrets, managing replica sets, and utilizing Kubernetes Horizontal Pod Autoscaling for dynamic scaling and robust orchestration
  • Setting up and managing multiple instances across regions, along with ephemeral storage and volumes, on AWS EC2
  • Implementing network and application load balancing for traffic distribution and utilizing AWS CloudWatch logs to analyze log groups
  • Creating anomaly detection mechanisms and configuring alarms for proactive monitoring and alerting
  • Executing data modeling, normalization, and performance optimization for Oracle, Cassandra, and AWS DynamoDB databases, while managing data de-duplication, replay, and multi-consumer group onboarding in Kafka, and fine-tuning replication factors and partitions to enhance data distribution
  • Incorporating Datadog Application Performance Monitoring to set up alerts for CPU, memory, and infrastructure metrics, while leveraging Dynatrace to configure, generate, and analyze distributed traces for comprehensive application performance insights and troubleshooting
  • Setting up composite pipelines and executing continuous integration/continuous deployment (CI/CD) processes using Jenkins and Git, including automated rollback capabilities, while employing Terraform to provision and manage AWS infrastructure deployments efficiently
  • Developing dynamic front-end applications by utilizing ReactJS and Angular for component-based architecture, JavaScript for scripting, HTML and CSS for structuring and styling, and Ajax and jQuery for asynchronous data handling and DOM manipulation
  • Creating robust unit tests using JUnit, Mockito, and Sonar to ensure code quality and reliability, while implementing application resiliency and security through methodologies such as Active Directory Federation Services (ADFS), OAuth 2.0, and Ping Identity for secure authentication and authorization (an illustrative JUnit and Mockito test follows this list)
  • Designing and automating data processing systems to handle large volumes of data
  • Conducting exploratory data analysis on enterprise databases and applying advanced data processing techniques to extract insights and drive data-driven decision-making
  • Implementing and applying data structures, algorithms, and design patterns for problem-solving
  • Integrating microservices with CyberArk and Kerberos for secure authentication
  • Using Spring and Spring Boot with Gradle and Maven for streamlined dependency management
  • Deploying AWS infrastructure with Terraform using plan, apply, destroy, and remote state management
  • Developing performance test automation frameworks with JMeter or Blazemeter to ensure high TPS capabilities
  • Creating custom dashboards, alerts, charts, and email reports in Splunk for comprehensive monitoring and reporting.
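For illustration, a minimal sketch of the kind of Spring Boot service setup referenced above, assuming the spring-boot-starter-web dependency; the class, package, and endpoint names here are hypothetical:

```java
// Minimal Spring Boot REST sketch; names (DemoApplication, /status) are illustrative only.
package com.example.demo;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
public class DemoApplication {

    public static void main(String[] args) {
        // Boots the embedded servlet container and the Spring application context.
        SpringApplication.run(DemoApplication.class, args);
    }
}

// Tiny REST controller picked up by component scanning in the same package.
@RestController
class StatusController {

    @GetMapping("/status")
    String status() {
        return "OK";
    }
}
```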
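Similarly, a hedged sketch of a Spring Kafka consumer, assuming Spring Boot auto-configuration with the spring-kafka dependency and broker settings supplied via application properties; the topic and consumer-group names are hypothetical:

```java
// Illustrative Spring Kafka listener; "orders" and "orders-consumers" are placeholder names.
package com.example.demo;

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class OrderEventsListener {

    // Joins the hypothetical "orders-consumers" consumer group on the "orders" topic.
    // Multiple instances in the same group split the topic's partitions between them.
    @KafkaListener(topics = "orders", groupId = "orders-consumers")
    public void onMessage(String payload) {
        // A real handler would deserialize the payload and apply de-duplication
        // checks before processing, as described in the requirements above.
        System.out.println("Received: " + payload);
    }
}
```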
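A small sketch of Java concurrency using lambda expressions, relying only on the JDK; the simulated fetchPrice lookup is a hypothetical stand-in for a remote call:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class AsyncPricing {

    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(4);

        // Launch two independent lookups concurrently; each supplier is a lambda.
        CompletableFuture<Double> base = CompletableFuture.supplyAsync(() -> fetchPrice("base"), pool);
        CompletableFuture<Double> fees = CompletableFuture.supplyAsync(() -> fetchPrice("fees"), pool);

        // Combine the results without blocking either lookup on the other.
        double total = base.thenCombine(fees, Double::sum).join();
        System.out.println("Total: " + total);

        pool.shutdown();
    }

    // Stand-in for a remote call; real code would hit a service or database.
    private static double fetchPrice(String component) {
        return "base".equals(component) ? 100.0 : 2.5;
    }
}
```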
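Finally, an illustrative JUnit 5 and Mockito unit test, assuming junit-jupiter and mockito-junit-jupiter on the test classpath; the service and repository types are hypothetical placeholders, not part of this posting:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;

@ExtendWith(MockitoExtension.class)
class AccountServiceTest {

    // Hypothetical collaborator, mocked so the test stays isolated from any data store.
    interface AccountRepository {
        double balanceFor(String accountId);
    }

    // Hypothetical class under test: adds a fixed bonus to the stored balance.
    static class AccountService {
        private final AccountRepository repository;

        AccountService(AccountRepository repository) {
            this.repository = repository;
        }

        double balanceWithBonus(String accountId) {
            return repository.balanceFor(accountId) + 10.0;
        }
    }

    @Mock
    private AccountRepository repository;

    @Test
    void addsBonusToStoredBalance() {
        // Stub the mocked repository, then verify the service's calculation.
        when(repository.balanceFor("acct-1")).thenReturn(90.0);

        AccountService service = new AccountService(repository);

        assertEquals(100.0, service.balanceWithBonus("acct-1"), 1e-9);
    }
}
```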

Responsibilities

  • Design, develop and implement software solutions.
  • Solve business problems through innovation and engineering practices.
  • Participate in all aspects of the Software Development Lifecycle (SDLC) including analyzing requirements, incorporating architectural standards into application design specifications, documenting application specifications, translating technical requirements into programmed application modules, and developing or enhancing software application modules.
  • Identify or troubleshoot application code-related issues.
  • Take an active role in code reviews to ensure solutions align with pre-defined architectural specifications.
  • Assist with design reviews by recommending ways to incorporate requirements into designs and information or data flows.
  • Participate in project planning sessions with project managers, business analysts, and team members to analyze business requirements and outline proposed solutions.

Benefits

  • We offer a competitive total rewards package including base salary determined based on the role, experience, skill set and location.
  • Those in eligible roles may receive commission-based pay and/or discretionary incentive compensation, paid in the form of cash and/or forfeitable equity, awarded in recognition of individual achievements and contributions.
  • We also offer a range of benefits and programs to meet employee needs, based on eligibility. These benefits include comprehensive health care coverage, on-site health and wellness centers, a retirement savings plan, backup childcare, tuition reimbursement, mental health support, financial coaching and more.