Senior Associate: DevOps Engineer

New York Life · New York, NY
Posted 37 days ago · $100,000 - $143,000 · Hybrid

About The Position

As a member of the Technology organization, you will have the opportunity to contribute to transformative initiatives that define New York Life’s digital future. By leveraging advanced technologies, including Generative AI and emerging autonomous data engineering capabilities, you will help enhance productivity, optimize data operations, and deliver seamless digital experiences for clients, agents, and employees.

Role Overview

We are currently seeking a DevOps Engineer to join our Field Experience value stream. In this role, you will support the development, integration, and maintenance of data flows across multiple platforms, including Informatica, Java-based services, and cross-system integrations. You will assist in building, troubleshooting, and optimizing data pipelines and integrations that underpin mission-critical Salesforce and enterprise systems used across the value stream. You will also collaborate with senior engineers and the data architect to enhance automation, reliability, and observability for data operations, contributing to the team’s broader modernization and AI-driven evolution.

Future Vision

At New York Life, we are actively exploring the evolution of autonomous DevOps, where predictive analytics, self-optimizing data pipelines, and AI-driven agents enhance stability, quality, and speed across the enterprise. This role offers the opportunity to be part of that journey, helping shape how agentic AI, observability tooling, and Generative AI transform the next generation of data engineering and integration operations across our Salesforce and non-Salesforce ecosystems.

Requirements

  • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent hands-on experience.
  • 3–5 years of hands-on experience in ETL development, data integration, Informatica Cloud, and Java-based services.
  • Proven experience designing, building, and maintaining complex data pipelines across multiple systems and platforms.
  • Prior experience troubleshooting integrations, optimizing SQL/PL-SQL, and supporting production data workflows.
  • Experience with version control, CI/CD, and enterprise SDLC processes.
  • Experience collaborating in Agile/Scrum environments.
  • Strong foundational knowledge of relational database systems, including understanding of schemas, referential integrity, indexing concepts, normalization/denormalization, and SQL query optimization.
  • Familiarity with Informatica Cloud, SQL/PL-SQL, and ETL concepts.
  • Understanding of Java, Spring, REST/SOAP services, and enterprise integration patterns.
  • Hands-on experience with relational databases such as Oracle or SQL Server, including schema design, performance optimization, indexing strategies, and complex query development.
  • Understanding of XML, JSON, and API integration concepts.
  • Experience with Git and SDLC best practices.
  • Strong analytical skills, with the ability to break down complex issues and learn new technologies quickly.
  • Clear verbal and written communication skills for collaborating with technical and non-technical stakeholders.
  • High level of accuracy in tasks, documentation, and operational support activities.
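As a rough illustration of the indexing and SQL query optimization concepts listed above, here is a minimal sketch using Python's built-in SQLite (a stand-in for the Oracle/SQL Server systems the role actually uses; the table and column names are hypothetical). It shows how adding an index on a filter column changes the query plan from a full table scan to an index search:

```python
import sqlite3

# In-memory stand-in for an enterprise database; schema is hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE policies (id INTEGER PRIMARY KEY, agent_id INTEGER, premium REAL)")
cur.executemany(
    "INSERT INTO policies (agent_id, premium) VALUES (?, ?)",
    [(i % 100, 100.0 + i) for i in range(1000)],
)

query = "SELECT premium FROM policies WHERE agent_id = ?"

# Without an index on agent_id, the planner must scan the whole table.
plan_before = cur.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[3]

# With an index on the filter column, the planner can do an index search.
cur.execute("CREATE INDEX idx_policies_agent ON policies (agent_id)")
plan_after = cur.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[3]

print(plan_before)  # a SCAN of the policies table
print(plan_after)   # a SEARCH using idx_policies_agent
```

On production databases the same reasoning applies through each engine's own plan tooling (e.g. Oracle's EXPLAIN PLAN), where index choice, selectivity, and schema design drive query cost.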

Responsibilities

  • ETL & Data Pipeline Support
  • Assist in the development, configuration, and maintenance of Informatica Cloud integrations, mappings, workflows, and related assets.
  • Support data extraction, transformation, and loading (ETL) processes using SQL, PL/SQL, and enterprise data platforms (Oracle, SQL Server).
  • Apply strong understanding of relational database structures — including table relationships, keys, indexing, and schema design — to ensure efficient and accurate data movement across systems.
  • Contribute to building and interpreting source-to-target mappings, ERDs, and data lineage documentation to support integration reliability and data quality.
  • Contribute to performance tuning and debugging of data pipelines across multiple integration systems.
  • Assist in validating data accuracy, completeness, and integrity across stages of the ETL process.
  • Integration & Application Support
  • Support REST/SOAP integrations using Java, Spring Framework, Spring Security, and CXF.
  • Assist with troubleshooting and debugging issues in application servers (WebSphere, Tomcat) and MuleSoft integrations (when applicable).
  • Collaborate with engineers to maintain, upgrade, and optimize Java-based web services.
  • Automation & Release Support
  • Participate in automation efforts for data workflows, including batch processes, CI/CD configurations, and integration testing.
  • Contribute to release management activities, including packaging, migration, and version control using GitHub.
  • Operational Reliability
  • Assist in monitoring job schedules, pipeline health, data quality, and system performance.
  • Participate in on-call or off-hours support rotations for operational issues, ensuring timely incident resolution.
  • Documentation & Governance
  • Document data flows, code changes, integration patterns, and operational procedures.
  • Support adherence to SDLC, audit requirements, and enterprise data governance practices.
  • Collaboration
  • Partner with senior team members, architects, analysts, and application owners to ensure smooth operation of enterprise data flows and integrations.
  • Participate in sprint ceremonies, retrospectives, and technical design discussions.
  • Continuous Improvement
  • Contribute to efforts that improve ETL performance, integration reliability, and operational tooling.
  • Participate in root-cause analyses and implement improvements that enhance resiliency.
  • AI & Agentic AI Innovation
  • Investigate opportunities to apply Generative AI, LLMs, and agentic AI to optimize data operations, including automated documentation generation, data-mapping intelligence, integration diagnostics, and predictive anomaly detection.
  • Support proof-of-concept efforts that evaluate autonomous agents for monitoring data pipelines, self-remediating integration failures, or optimizing load performance.
  • Contribute to dashboards or observability tools enhanced with AI-based insights for drift detection, dependency analysis, or data health monitoring.
  • Learning & Development
  • Stay up to date on Informatica Cloud features, Java frameworks, integration best practices, and AI trends in data engineering.
  • Participate in training opportunities to grow your skills across ETL, cloud, Java, and integration platforms.
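The data-validation responsibility above (checking accuracy, completeness, and integrity across ETL stages) can be sketched as a small post-load check. This is a hypothetical, simplified example, not the team's actual tooling: it compares row counts and an order-independent content checksum between a source extract and the loaded target:

```python
import hashlib

def table_checksum(rows):
    """Order-independent checksum over normalized row tuples."""
    digest = 0
    for row in rows:
        h = hashlib.sha256("|".join(str(v) for v in row).encode()).hexdigest()
        digest ^= int(h, 16)  # XOR makes the result independent of row order
    return digest

def validate_load(source_rows, target_rows):
    """Return a list of validation issues; empty means the load checks out."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    if table_checksum(source_rows) != table_checksum(target_rows):
        issues.append("content checksum mismatch")
    return issues

# Hypothetical extract: same data, loaded in a different order.
source = [(1, "Alice", 100.0), (2, "Bob", 250.5)]
target = [(2, "Bob", 250.5), (1, "Alice", 100.0)]
print(validate_load(source, target))      # -> []
print(validate_load(source, target[:1]))  # -> two issues: count and checksum
```

In practice these checks would run against query results from the source and target systems after each load, feeding the monitoring and data-quality dashboards described under Operational Reliability.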

Benefits

  • We provide a full package of benefits for employees, with unique offerings for a modern workforce, including leave programs, adoption assistance, and student loan repayment programs.
  • Based on feedback from our employees, we continue to refine and add benefits to our offering, so that you can flourish both inside and outside of work.
  • Recognized as one of Fortune’s World’s Most Admired Companies, New York Life is committed to improving local communities through a culture of employee giving and volunteerism, supported by the Foundation.
  • Pay Transparency
  • Discretionary bonus eligible: Yes