Senior Data Engineer (Temp to Hire)

National Guardian Life Insurance Company
Madison, WI
Hybrid

About The Position

Since 1909, National Guardian Life Insurance Company (NGL) has been one of America’s most successful and highly rated independent life insurance companies. NGL specializes in a suite of innovative products for life’s journey, providing financial stability, guidance, and peace of mind. NGL's Core Values—integrity, dependability, collaboration, compassion, and growth—form the foundation of the company, guiding interactions with policyholders, partners, funeral homes, and each other. NGL is committed to creating an inclusive, welcoming environment where diversity is celebrated and employees are encouraged to be their authentic selves, and offers Employee Resource Groups for professional and personal development.

The Senior Data Engineer works in an Agile team environment and is responsible for developing and optimizing data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The role provides thought leadership and hands-on effort to support software developers, database architects, data analysts, and data scientists on data initiatives, ensuring optimal data delivery architecture. The Senior Data Engineer also assists with the development and maintenance of analytic tools, data visualizations, and their supporting platforms, and offers thought leadership within the Data Science team. The position includes training and developing junior engineers and requires an understanding of the Company's strategic business goals, processes, and solution requirements in order to develop effective technical solutions.

Requirements

  • Ability to understand high-level architecture and design content, and how it relates to implementation and deployment
  • Advanced working Python and PySpark knowledge and experience working with AWS services such as Glue, Lambda, SQS, SNS, RDS, Redshift, Athena, DynamoDB, and S3 to support data transformation, data structures, metadata, dependency and workload management
  • Strong knowledge of Terraform, including modules
  • Strong knowledge of GitHub (or an equivalent version control platform), including opening and reviewing pull requests, branching strategies, and working in a collaborative environment
  • Experience building and optimizing data pipelines, architectures and data sets
  • Strong understanding of DevOps practices and infrastructure automation
  • Working knowledge of message queuing, stream processing, and highly scalable data stores
  • Successful history of manipulating, processing and extracting value from large disconnected datasets
  • Ability to perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Advanced knowledge and understanding of relevant business and system operations, policies, and procedures
  • Strong project management and organizational skills
  • Proven abilities to take initiative and be innovative
  • Experience supporting and working with cross-functional teams in a dynamic Agile and Scrum environment
  • Strong technical and non-technical communication (verbal and written) and interpersonal skills
  • Excellent organizational skills and time/priority management
  • Bachelor's degree in Computer Science, Information Technology/Systems, or other related field
  • A minimum of five years of experience with Python and building data pipelines
  • A minimum of three years of experience working with AWS services with a focus on data, and with infrastructure as code (IaC)
  • A minimum of one to two years of experience building or maintaining CI/CD pipelines
  • A minimum of four years of applying Agile methodology (Scrum and/or Kanban, Test Driven Development)
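As a rough illustration of the data-transformation work the requirements above describe, here is a minimal, standard-library-only Python sketch of the kind of cleaning-and-aggregation step a Glue or PySpark job would apply at scale. All record fields and values are hypothetical, not taken from NGL's systems:

```python
from collections import defaultdict

def transform(records):
    """Clean raw policy records and total premiums by state.

    A stdlib-only stand-in for the kind of transform a Glue/PySpark
    job would run over a large dataset; field names are hypothetical.
    """
    totals = defaultdict(float)
    for rec in records:
        state = (rec.get("state") or "").strip().upper()
        try:
            premium = float(rec.get("premium", ""))
        except ValueError:
            continue  # drop rows with unparseable premium amounts
        if state:  # drop rows with a missing state
            totals[state] += premium
    return dict(totals)

sample = [
    {"state": "wi", "premium": "120.50"},
    {"state": "WI ", "premium": "79.50"},
    {"state": "", "premium": "10"},      # missing state: dropped
    {"state": "IL", "premium": "oops"},  # bad amount: dropped
]
print(transform(sample))  # {'WI': 200.0}
```

In a real pipeline the same normalize-filter-aggregate shape would typically be expressed with PySpark DataFrame operations so it can run distributed inside a Glue job.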

Nice To Haves

  • AWS certifications (e.g., AWS Certified Data Analytics - Specialty)

Responsibilities

  • Assembles large, complex data sets that meet functional / non-functional business requirements
  • Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Builds the AWS infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources, using Python and PySpark with services and technologies such as EC2, S3, data lakes, Glue jobs, Lambda functions, and Delta Lake
  • Implements infrastructure as code (IaC) practices to automate deployment and management of data services
  • Contributes to the continuous integration and continuous deployment (CI/CD) process
  • Builds analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics
  • Develops in-depth understanding of the data environment and leverages that knowledge to build robust, scalable solutions
  • Develops subject matter expertise (Insurance data and processes, operational systems) and applies to development of solutions
  • Collaborates with other developers to transform backlog items into high-functioning, well-designed, testable and efficient code
  • Writes and maintains technical documentation to describe application logic, coding, testing, changes, history and corrections
  • Participates in and performs code/design reviews; strives for continuous improvement of code quality and development practices
  • Implements AWS resources to build out cloud data architecture
  • Applies knowledge of Microservices and Terraform infrastructure to create efficient, automated and loosely coupled architectures
  • Performs routine maintenance and upgrades to make systems more secure and efficient, and to adapt them to any new requirements
  • Assists in the maintenance of NGL’s database and analytics systems
  • Anticipates system/application challenges and proposes solutions; recommends improvements to existing software as necessary
  • Troubleshoots and resolves issues for both internal and third-party business applications
  • Assists with issue identification, investigation, and resolution process of support incidents
  • Provides off-hours support of scheduled production processing and system maintenance when necessary
  • Shares ownership of the solution deployment, testing, quality, monitoring and operational excellence with the rest of the Agile team
  • Participates in regular team and stakeholder meetings
  • Participates in code reviews and mentors junior engineers
  • Collaborates with cross-functional teams to understand data requirements and translate them into scalable solutions
  • Continually develops skills and abilities to keep them relevant, current, and applicable to NGL's current and future needs
  • Follows software development life cycle and quality assurance best practices and governance
  • Ensures compliance with security and privacy standards
  • Performs other duties and responsibilities as needed
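Several of the responsibilities above center on event-driven AWS data services. As a minimal sketch of that pattern, the following shows the general shape of an S3-triggered Lambda entry point that extracts bucket and key information from the standard S3 event payload. The bucket and key names are hypothetical, and the actual read/transform of the object (e.g. via boto3) is omitted:

```python
import urllib.parse

def lambda_handler(event, context):
    """Entry point for an S3-triggered Lambda.

    Pulls the bucket name and object key out of each record in the
    standard S3 event notification payload. A real handler would go
    on to fetch and transform each object, which is omitted here.
    """
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # S3 delivers object keys URL-encoded (spaces arrive as '+').
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        processed.append((bucket, key))
    return {"processed": processed}

# Invoke locally with a hand-built event to show the payload shape.
fake_event = {"Records": [
    {"s3": {"bucket": {"name": "ngl-raw-data"},
            "object": {"key": "policies/2024/file+name.csv"}}}
]}
print(lambda_handler(fake_event, None))
```

In a deployed pipeline, the trigger wiring (S3 notification to Lambda) would itself be declared in Terraform, consistent with the IaC responsibilities listed above.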

Benefits

  • NGL's Core Values – integrity, dependability, collaboration, compassion, and growth – form the foundation of our company and guide the interactions we have with our policyholders, partners, funeral homes, and each other
  • We believe in creating an inclusive, welcoming environment for all where diversity is celebrated, and everyone is encouraged to live their best, most authentic self
  • We offer Employee Resource Groups for employees to get involved, learn, network, and offer professional and personal development opportunities
  • NGL is committed to providing reasonable accommodations to qualified individuals with disabilities in the recruitment process