Sr. Databricks Data Engineer, Vice President, Hybrid

State Street, Princeton, NJ
Hybrid

About The Position

The person in this role will have the extensive technical, people, and process skills needed to deliver successfully on this initiative. The position requires a strong background in application architecture and development and a solid understanding of leading IT technologies. This is a hands-on role: as a member of an agile scrum team, you will contribute to the development, enhancement, and maintenance of our platforms. It is an opportunity to work within a very strong development team and help build challenging new products in the financial space using cutting-edge technology.

Requirements

  • Experience with microservices architecture and an understanding of cloud computing is highly desirable (Azure preferred)
  • Proficient working knowledge of data warehousing platforms such as Databricks, Delta Lake, and Apache Spark
  • Hands-on development experience in Databricks SQL and Scala
  • Strong understanding of the Databricks platform, including clusters, jobs, and other resources
  • Monitor and troubleshoot data pipelines to identify and resolve issues
  • Implement data quality checks and validations to ensure data accuracy
  • Experience performing data analysis and data exploration
  • Strong hands-on experience troubleshooting DevOps pipelines (Azure DevOps and Harness)
  • Experience working in a multi-developer environment using version control (e.g., Git)
  • Strong critical thinking, communication, and problem-solving skills
  • Experience handling semi-structured data (Avro, JSON, and XML)
  • Basic knowledge of shell scripting, UNIX, TOAD, and SQL Developer
  • Bachelor’s degree in a computer- or IT-related subject
  • 10+ years of overall big data pipeline experience
  • 5+ years of hands-on Databricks experience
  • 5+ years of cloud-based development experience, including Azure services, Azure DevOps, Kubernetes, and Docker
  • Communicate effectively and professionally, both in writing and orally
  • Team player with a positive attitude, enthusiasm, initiative, and self-motivation
  • Ability to multi-task; an energetic, fast learner and problem solver
  • Experience working in the financial industry
  • Experience with agile development methodology

Nice To Haves

  • Experience with Snowflake is a plus

Responsibilities

  • Design, build, and test end-to-end data pipelines, including data ingestion (streaming, events, and batch), on cloud-based infrastructure using Azure and Cosmos DB
  • Develop and implement ETL processes for ingesting, transforming, and loading data into data lakes
  • Work extensively with Databricks and data warehousing concepts
  • Design and develop custom high-throughput, configurable frameworks and libraries
  • Drive change through collaboration, influence, and demonstration of POCs
  • Responsible for all aspects of the software development lifecycle, including design, coding, integration testing, deployment, and documentation
  • Work collaboratively within an agile project team
  • Follow best practices and coding standards
  • Grow your personal skillset

Benefits

  • our retirement savings plan (401K) with company match
  • insurance coverage including basic life, medical, dental, vision, long-term disability, and other optional additional coverages
  • paid-time off including vacation, sick leave, short term disability, and family care responsibilities
  • access to our Employee Assistance Program
  • incentive compensation including eligibility for annual performance-based awards (excluding certain sales roles subject to sales incentive plans)
  • eligibility for certain tax advantaged savings plans
  • flexible work programs
  • development programs and educational support
  • paid volunteer days
  • matching gift programs
  • access to employee networks