Lead Software Engineer, Java/Spark/AWS

JPMorgan Chase & Co., Jersey City, NJ
$152,000 - $215,000

About The Position

We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Lead Software Engineer at JPMorgan Chase within the Corporate Technology Office, you will lead a technical area and drive impact across teams, technologies, and projects spanning multiple departments. You will leverage your deep expertise in software engineering, applications, technical processes, and product management to deliver complex projects and initiatives.

Requirements

  • Formal training or certification on software engineering concepts and 5+ years applied experience
  • 5+ years of hands-on professional experience in one or more programming languages, including Java or Python, for data engineering tasks
  • Hands-on experience with Databricks (Spark, Delta Lake, notebooks, job orchestration) and AWS data services (EMR, Athena, Glue, S3)
  • Hands-on experience using Apache Spark for large-scale data processing, including developing and optimizing data pipelines, performing real-time and batch analytics, and leveraging Spark's libraries for machine learning and data transformation to drive actionable business insights
  • Experience leveraging AI tools to increase developer productivity (SDD, agentic AI, Skills, Copilot, Claude Code, etc.)
  • Experience with modern monitoring and logging tools (e.g., Dynatrace, Splunk, Grafana, Prometheus)
  • Proficiency in all aspects of the Software Development Life Cycle, with familiarity with CI/CD, DevOps, and infrastructure-as-code tools in cloud environments
  • Proven experience leading and mentoring software engineers at varying levels

Nice To Haves

  • Application development experience delivering complex enterprise Investment Banking applications for Market Surveillance, Investment Banking front-office trading systems, or analytics systems in the FX, Commodities, Equities, and Equity Derivatives domains
  • Familiarity with Spring Boot-based microservices architecture and RESTful API development
  • Experience with container technologies (e.g., Kubernetes and Docker)
  • Experience with Kafka streaming
  • Knowledge of financial products, including Futures & Options, FX, Commodities, Equities, and Equity Derivatives, as well as trade lifecycles and/or order workflow

Responsibilities

  • Design, build, maintain, and optimize robust ETL data pipelines using Databricks (Spark, Delta Lake, Unity Catalog), ensuring efficient ingestion, transformation, and storage
  • Collaborate with data product owners and business stakeholders, and ensure best practices in data engineering, software engineering, and resilient cloud architecture
  • Architect and implement data lake and data warehouse solutions leveraging AWS services (S3, Glue, SQS, SNS, Lambda, EMR, etc.)
  • Collaborate across teams to propose and build new solutions that support the overall application platform through observability, orchestration, resiliency, developer experience, and automation
  • Identify opportunities to eliminate recurring issues or automate their remediation, improving the overall operational stability of software applications and systems
  • Lead and mentor a team of engineers, providing technical guidance and code reviews
  • Monitor, troubleshoot, and tune data pipelines and cloud resources for optimal performance, reliability, and cost efficiency
  • Promote reusability across data pipelines and operational simplicity by introducing strong observability standards

Benefits

  • Comprehensive health care coverage
  • On-site health and wellness centers
  • A retirement savings plan
  • Backup childcare
  • Tuition reimbursement
  • Mental health support
  • Financial coaching