Data Engineer - Enterprise Data Management

Voya Financial, Hartford, CT
Hybrid

About The Position

Together we fight for everyone’s opportunity for a better financial future. We will do this together — with customers, partners and colleagues. We will fight for others, not against: we will stand up for and champion everyone’s access to opportunities. The status quo is not good enough; we believe every individual and every community deserves access to financial opportunities. We are determined to support both individuals and communities in reaching a better financial future. We know that reaching this future depends on our actions today. Like our Purpose Statement, Voya believes in being bold and committed to action.

We are committed to a work environment where the differences that we are born with — and those we acquire throughout our lives — are understood, valued and intentionally pursued. We believe that our employees own our culture and have a responsibility to foster an environment where we all feel comfortable bringing our whole selves to work. Purposefully bringing our differences together to positively influence our culture, serve our clients and enrich our communities is essential to our vision.

Are you ready to join a company with a strong purpose and a winning culture? Start your Voyage – Apply Now

Profile Summary

The Data Engineer is responsible for designing, building, and maintaining scalable data pipelines and systems that deliver trusted data for analysis and product use cases. This role partners with cross-functional teams to understand data needs and implement solutions that support both near-term and long-term objectives. The role requires the ability to contribute to technical design, ensure data quality, and operate with increasing independence and accountability. Specific responsibilities are listed under Responsibilities below.

Requirements

  • 2-5 years of experience in data engineering, data modeling, and ETL pipelines
  • Proficient in SQL and Python for creating, improving, and fixing data pipelines
  • Experience with cloud and data platforms, especially Azure and Databricks (Delta Live Tables and Unity Catalog)
  • Strong understanding of tools like SnapLogic, Azure Data Factory, and Jenkins for data integration and orchestration
  • Practical experience with Terraform for infrastructure as code and managing deployment pipelines
  • Experience integrating with APIs
  • Knowledge of data quality and monitoring tools, particularly Soda or similar
  • Proficient in version control and CI/CD workflows, using tools like GitHub
  • Solid understanding of data modeling principles (e.g., dimensional modeling, normalization)
  • Comfortable working in agile teams, with a proactive approach to planning, organizing tasks, and collaborating

Responsibilities

  • Develop and maintain batch and streaming data pipelines using modern tools and frameworks.
  • Design transformations, optimize performance, and ensure reliable data delivery.
  • Design and implement scalable and maintainable data models and storage solutions that align with business needs and support efficient querying, analysis, and data integration efforts.
  • Engage in agile best practices, help refine stories, identify dependencies, and proactively raise risks or concerns to ensure work is completed on time or escalated when needed.
  • Implement and enforce data quality controls, validation, and compliance standards across pipelines.
  • Support the deployment, scheduling, and monitoring of data pipelines and workflows to ensure consistent, reliable execution.
  • Maintain comprehensive documentation and advocate for coding standards, best practices, and reusable components.
  • Collaborate regularly with cross-functional teams to clarify data requirements, document assumptions, and deliver high-quality solutions.
  • Communicate clearly during stand-ups, design discussions, and retrospectives.
  • Actively contribute to team code reviews and share learnings with peers.

Benefits

  • Health, dental, vision and life insurance plans
  • 401(k) Savings plan – with generous company matching contributions (up to 6%)
  • Voya Retirement Plan – employer paid cash balance retirement plan (4%)
  • Paid time off – 20 days of paid time off, nine paid company holidays, and a flexible Diversity Celebration Day
  • Paid volunteer time — 40 hours per calendar year