Specialist, Data Engineer

Nationwide · Columbus, OH
Hybrid

About The Position

If you’re passionate about being part of a dynamic organization that enables a Fortune 100 company with nearly $70 billion in annual sales to drive innovation and adopt new technologies that deliver business results, then Nationwide’s Technology team could be the place for you! At Nationwide®, “on your side” goes beyond words: our customers are at the center of everything we do, and we’re looking for associates who are passionate about delivering extraordinary care. If you care about data engineering and about delivering the clean, automated, integrated, and governed data required to fulfill our company’s most critical strategic objectives, we have a great opportunity for you.

As a member of the Enterprise Data Office, the Data Engineer works closely with our Dev Line partners, Data Architects, business stakeholders, platform owners, and many other internal partners to design, build, and operationalize end-to-end data delivery solutions. The position combines hands-on data engineering with an understanding of both the strategic goals and the desired capabilities of the business units we support. It involves developing data pipelines and application architecture spanning data ingestion, harmonization, and curation to enable business value realization across numerous consumer tiers. As new solutions are developed as part of Build activities, this role also works with application teams and other key business and IT stakeholders on the production hand-off, adoption, and ongoing management of new technologies, processes, and best practices.

The skills and experience a successful candidate should possess are listed under Requirements and Nice To Haves below.

This role will work a hybrid schedule, coming into the Columbus, Ohio office 2 days per week. This role does not qualify for employer-sponsored work authorization. Nationwide does not participate in the STEM OPT Extension program.

Job Description Summary

Nationwide’s industry-leading workforce is passionate about creating data solutions that are secure, reliable, and efficient in support of our mission to provide extraordinary care. Nationwide embraces an agile work environment and collaborative culture, applying data analysis, quality, visualization, governance, engineering, robotic process automation, and machine learning to produce targeted data solutions grounded in an understanding of business processes, relationships, and requirements. If you have the drive and desire to be part of a forward-looking, data-enabled culture, we want to hear from you.

As a Data Engineer, you’ll be responsible for acquiring, curating, and publishing data for analytical or operational uses. Data should be in a ready-to-use form that creates a single version of the truth across all data consumers, including business users, data scientists, and Technology. Ready-to-use data can serve both real-time and batch processes and may include unstructured data. Successful data engineers have the skills required across the full software engineering lifecycle: translating requirements into design, development, testing, deployment, and production maintenance. You’ll have the opportunity to work with a range of technologies, from big data, relational, and SQL databases to unstructured data technology and programming languages.

Requirements

  • Minimum of 3 years of experience performing complex data pipeline development across numerous platforms (e.g., Informatica, Databricks, Snowflake, native Python).
  • Minimum of 2 years of development experience utilizing AWS cloud technologies (storage, container, compute, security, and automation).
  • Fluency in Python, SQL, and Unix.
  • High proficiency with generating complex SQL queries is required.
  • Ability to understand and explain complex data integration and extraction/transformation/load (ETL) processes.
  • Demonstrated proficiency in utilizing AI tools to perform automated code development and testing tasks as part of typical daily work routine.
  • Actively promotes reusable processes, engineering patterns, and work products.
  • Demonstrated success at applying standardized automation, secure data practices, and quality routines to data pipelines and information reporting solutions.
  • Routine development and deployment of application monitoring and alerting capabilities as part of standard pipeline build activities.
  • Extensive experience utilizing DevOps and CI/CD tools as part of regular development activities.
  • Demonstrated history of converting innovative ideas into scalable, production-ready solutions.
  • Knowledge of data governance tools, processes, and policies, and the ability to influence their adoption on projects driven by our IT and business partners.
  • Proven ability to communicate effectively with people at all levels of business and technology organizations.
  • Demonstrated skill working directly with stakeholders, business partners, SMEs, systems peers, and cross-functional teams to collaborate and gain agreement on target solution designs.

Nice To Haves

  • Experience coaching junior associates, associates transitioning into new skills and technologies, and mentees.
  • Insurance and Financial domain experience is highly desired.
  • Experience working in an Agile environment using Lean, Kanban, and Scrum practices.

Responsibilities

  • Provides basic to moderate technical consultation on data product projects by analyzing end-to-end data product requirements and existing business processes to help lead the design, development, and implementation of data products.
  • Produces data building blocks, data models, and data flows for varying client demands such as dimensional data, standard and ad hoc reporting, data feeds, dashboard reporting, and data science research & exploration.
  • Applies secure software and systems engineering practices throughout the delivery lifecycle to ensure our data and technology solutions are protected from threats and vulnerabilities.
  • Translates business data stories into a technical story breakdown structure and work estimate so that value and fit for a schedule or sprint can be determined.
  • Creates simple to moderate business user access methods to structured and unstructured data, using techniques such as mapping data to a common data model, NLP, AI, statistical computations, transforming data as necessary to satisfy business rules, and validating data content.
  • Assists the enterprise DevSecOps team and other internal organizations with CI/CD best practices, using tools such as JIRA, Jenkins, and Confluence.
  • Implements production processes and systems to monitor data quality, ensuring production data is accurate and available for the key stakeholders and business processes that depend on it.
  • Develops and maintains scalable data pipelines for both streaming and batch requirements and builds out new API integrations to support continuing increases in data volume and complexity.
  • Writes and performs data unit/integration tests for data quality.
  • With input from business requirements or stories, creates and executes test data and scripts to validate that quality and completeness criteria are satisfied.
  • Creates automated testing programs and data that are reusable for future code changes.
  • Practices code management and integration using Git repositories in line with engineering principles and practices.
  • May perform other responsibilities as assigned.

Benefits

  • medical/dental/vision
  • life insurance
  • short and long term disability coverage
  • paid time off, with newly hired associates receiving a minimum of 18 days of paid time off each full calendar year, pro-rated quarterly based on hire date
  • nine paid holidays
  • 8 hours of Lifetime paid time off
  • 8 hours of Unity Day paid time off
  • 401(k) with company match
  • company-paid pension plan
  • business casual attire