Systems Analyst 3 - Data Engineer

Sammons Financial Group - Sioux Falls, SD

About The Position

Sammons Financial Group is seeking a Systems Analyst – Data Engineer to design, develop, and implement scalable data and integration solutions that power Life analytics and operations. This role requires strong technical expertise in Snowflake or similar EDW/Lakehouse platforms, SQL, Python, and cloud-based data platforms (Azure, AWS), with a focus on enabling high-quality data flow, seamless integration, and modern data architecture. The ideal candidate will be both a hands-on technologist and a strategic thinker who ensures solutions are performant, secure, and aligned with business objectives while supporting emerging analytics and AI use cases.

Requirements

  • Strong technical expertise in Snowflake or similar EDW/Lakehouse platforms, SQL, Python, and cloud-based data platforms (Azure, AWS).
  • Certifications in Snowflake.
  • Experience with Python or Java for data processing and automation.
  • Experience working in Agile delivery models and cross-functional enterprise environments.
  • College degree in computer science, information science, management information systems, or a related field preferred
  • Minimum 8 years of IT development experience or equivalent preferred
  • Effective verbal and written communication skills and the ability to communicate with business partners and other IT staff
  • Problem-solving skills sufficient to research issues and recommend a proposed solution
  • Able to work on multiple tasks and meet established deadlines
  • Able to effectively direct and coordinate the work of other team members on a project without having HR management responsibility for them
  • Knowledge of computer programming languages as required for the system

Nice To Haves

  • Familiarity with emerging technologies supporting AI/ML and advanced analytics.

Responsibilities

  • Design, develop, and implement scalable data ingestion, integration, and processing pipelines across cloud platforms (Azure, AWS, Snowflake, and similar EDW/Lakehouse platforms).
  • Develop and manage data orchestration workflows using tools such as Azure Data Factory (ADF), Azure Data Lake (ADLS), dbt, and comparable technologies.
  • Ingest and process large volumes of structured, semi-structured, and unstructured data, including compressed formats (e.g., .tar), and automate extraction, transformation, and loading processes.
  • Design and implement modern data lakehouse architectures, including Iceberg (or similar table formats), to support scalable and high-performance analytics.
  • Develop and maintain data models that accurately represent complex relationships within life insurance and policy administration domains.
  • Integrate enterprise data platforms with internal and external systems (e.g., APIs, Kafka, MuleSoft) to enable real-time and batch data exchange.
  • Collaborate with product owners, architects, analysts, and developers to translate business, functional, and non-functional requirements into scalable technical solutions.
  • Establish and enforce data engineering standards, best practices, and governance controls across ingestion, transformation, and storage layers.
  • Implement data quality validation, reconciliation processes, and error handling to ensure accuracy, consistency, and reliability of data pipelines.
  • Monitor pipeline performance, reliability, scalability, and cost efficiency; recommend and implement improvements.
  • Research and recommend emerging tools, patterns, and technologies to improve data ingestion, processing, integration, and enablement of AI-driven use cases.
  • Build and support backend data systems, APIs, distributed services, and orchestration layers for secure enterprise use.
  • Support enterprise data integration capabilities across Azure, Snowflake, and similar platforms to unify, govern, and operationalize data for analytics and AI applications.
  • Support code versioning, CI/CD pipelines, and controlled deployment of data engineering assets.
  • Troubleshoot pipeline, data, and platform issues across development, testing, and production environments.
  • Provide operational support, including on-call or on-demand support for critical data processes.

Benefits

  • Comprehensive health coverage for you and your family, including Medical, Dental, Vision, HSA & FSA options, and term life insurance
  • Competitive compensation with a performance-based incentive program tied to clear goals and individual and/or company success
  • 100% company-funded Employee Stock Ownership Plan (ESOP)
  • Automatic enrollment in our 401(k)
  • Friday afternoons off year-round
  • Generous paid time off
  • Paid holidays
  • Paid development time
  • Tuition reimbursement
  • Professional development opportunities across industry, individual, and leadership programs
  • Volunteer time off
  • Company nonprofit matching gift program