Senior Data Engineer - Snowflake Platform

SchoolsFirst Federal Credit Union
Sacramento, CA
Hybrid

About The Position

Responsible for creating, transforming, and expanding the data pipeline to various target destinations for consumption across the Credit Union data architecture. This role will be responsible for the development, maintenance, testing, and support of these solutions.

The Senior Data Engineer will create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements; and identify, design, and implement internal process improvements such as automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability. They will build the infrastructure required for optimal ETL/ELT of data from a wide variety of data sources using SQL and cloud 'big data' technologies, and manage the Snowflake environment, including administration, data ingestion, and security.

The role involves collaboration with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs, while ensuring SFFCU and Member data security and compliance with NCUA and regulatory policies. Additionally, the engineer will create data tools for analytics and data science team members to enhance analytic capabilities and strive for greater functionality in data ecosystems.
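The ETL/ELT work described above can be sketched as a load-then-transform pipeline: land raw data in a staging table, then transform it with SQL inside the engine. The example below is a minimal illustration, not the Credit Union's actual implementation; sqlite3 stands in for Snowflake so it is runnable, and the table and column names are hypothetical.

```python
# Minimal ELT sketch: load raw rows first, then transform in SQL inside the
# warehouse. sqlite3 stands in for Snowflake so the example is self-contained;
# table and column names are illustrative, not from the posting.
import sqlite3

def run_elt(raw_rows):
    con = sqlite3.connect(":memory:")
    cur = con.cursor()
    # "Load": land raw source data as-is in a staging table (values kept as text).
    cur.execute("CREATE TABLE stg_transactions (member_id TEXT, amount TEXT)")
    cur.executemany("INSERT INTO stg_transactions VALUES (?, ?)", raw_rows)
    # "Transform": cast, clean, and aggregate inside the engine with SQL.
    cur.execute("""
        CREATE TABLE fct_member_totals AS
        SELECT member_id, SUM(CAST(amount AS REAL)) AS total_amount
        FROM stg_transactions
        GROUP BY member_id
    """)
    return cur.execute(
        "SELECT member_id, total_amount FROM fct_member_totals ORDER BY member_id"
    ).fetchall()

rows = run_elt([("m1", "10.50"), ("m1", "4.50"), ("m2", "3.00")])
print(rows)  # [('m1', 15.0), ('m2', 3.0)]
```

In a real Snowflake deployment the "load" step would typically be a `COPY INTO` from a stage and the "transform" step a `CREATE TABLE ... AS SELECT`, but the shape of the pipeline is the same.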

Requirements

  • Bachelor's Degree in CS or MIS, MCDBA certification, or equivalent years of experience required
  • 7-10 years of experience in data engineering, or experience with ETL/ELT and data integration tools, required
  • Working knowledge of cloud data platforms, with hands‑on experience using Snowflake for data storage, processing, and security
  • Working knowledge of BI & data visualization tools
  • Advanced working SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of non-relational database technologies
  • Experience building and optimizing data pipelines, architectures and data sets
  • Experience with data integration tools and ETL/ELT tools, such as QLIK and Boomi
  • Experience building processes supporting data transformation, data structures, metadata, dependency and workload management
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Strong analytic skills related to working with a variety of datasets
  • Proven success in manipulating, processing and extracting value from large and disconnected datasets
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
  • Experience in project management, including agile methodology, and strong organizational skills
  • Experience supporting and working with cross-functional teams in a dynamic environment
  • Experience with traditional relational SQL databases such as SQL Server or Oracle
  • Experience with data pipeline and workflow management tools like QLIK and Boomi
  • Experience with cloud services
  • Experience with object-oriented and scripting languages: Python, PowerShell, etc.
  • Strong analytical, problem solving and conceptual skills
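The message queuing and stream processing requirement above can be illustrated with a minimal producer/consumer sketch. Python's standard-library queue stands in for a real message broker such as Kafka; the event shape and the doubling transformation are hypothetical placeholders.

```python
# Minimal queue-based stream processing sketch: a producer puts events on a
# queue and a consumer thread processes them as they arrive. queue.Queue
# stands in for a real message broker; the transformation is a placeholder.
import queue
import threading

def consume(q, results):
    # Worker: pull events off the queue until a sentinel (None) arrives.
    while True:
        event = q.get()
        if event is None:  # sentinel: producer is done
            q.task_done()
            break
        results.append(event["amount"] * 2)  # stand-in transformation
        q.task_done()

q = queue.Queue()
results = []
worker = threading.Thread(target=consume, args=(q, results))
worker.start()
for amount in (1, 2, 3):       # producer side: emit events
    q.put({"amount": amount})
q.put(None)                    # signal end of stream
worker.join()
print(results)  # [2, 4, 6]
```

The same pattern (consume, transform, acknowledge) carries over to broker-backed pipelines, where `q.get()` becomes a poll against a topic or queue subscription.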

Nice To Haves

  • Experience with BI Analytics Frameworks is desired

Responsibilities

  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional / non-functional business requirements
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal ETL/ELT of data from a wide variety of data sources using SQL and cloud ‘big data’ technologies
  • Manage the Snowflake environment, including administration, data ingestion, and security
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs
  • Keep SFFCU and Member data secure and compliant with NCUA and regulatory policies
  • Create data tools for analytics and data science team members that assist them in building and optimizing our analytic capabilities
  • Work with data and analytics experts to strive for greater functionality in our data ecosystems
  • Whenever possible, provide opportunities to serve and teach others
  • Be accountable for staying up to date with data storage technologies in a collaborative environment
  • Perform other duties as assigned
  • Comply with regulatory compliance and assigned training requirements, including but not limited to BSA regulations corresponding to specific job duties
