Data Engineer (Job ID #225)

Cascade Financial Services
$117,395 · Remote

About The Position

Apply an in-depth understanding of data structures and information content to select, deploy, and manage the systems and infrastructure required for a data processing pipeline in support of project requirements, with a goal of enterprise master data integration. Full responsibilities are listed below. The position is 100% remote, reporting to headquarters in Chandler, AZ.

Requirements

  • Requires a Master’s or Bachelor’s degree in Computer Science, Business, Math, or a related field
  • 1 year of experience required with a Master’s degree, or 3 years of experience required with a Bachelor’s degree
  • Must have experience in each of the following skills:
      • Data architecture
      • Information technology
      • Python and SQL
      • Docker, CI/CD, and Jenkins
      • Postgres, Microsoft SQL, SSIS, and SSRS
      • AWS containers and servers
      • Enterprise data management technologies, including data warehouses, ETL tools, SQL, and master data management solutions
      • Master Data Management, including technical implementation of data cleansing, de-duplication, and data quality practices

Responsibilities

  • Select, deploy, and manage the systems and infrastructure required for a data processing pipeline
  • Investigate, create, and maintain data flows, data content, and data element definitions with a goal of enterprise master data integration
  • Determine technical breadth in data profiling from different sources, and determine whether and how data can support the business and data requirements of its intended use
  • Develop and maintain common business definitions and metadata criteria for consistent metrics reporting across the enterprise
  • Design the architecture for a new data and analytics platform to support analytics, data science, and machine learning
  • Design the data models and data movement processes that support analytics and data science
  • Recommend and implement patterns and best practices for data engineering
  • Ensure quality processes are built into the design of the platform
  • Understand the architectural difference between solution approaches and communicate the advantages/disadvantages of your recommendation to both technical and non-technical audiences
  • Design and develop analytics and interactive visualizations that create business insights and clearly communicate data and trends
  • Develop complex SQL queries to obtain data from our source systems
  • Perform data validation and quality assurance to ensure data integrity and accuracy
  • Collaborate with IT and business partners to identify data sources and align data domains to authoritative sources of data
  • Provide tactical enforcement of Data Governance policies and rules
  • Research new technologies while keeping up-to-date with technological developments in relevant areas of Data Governance, Master Data, and Data Quality

Benefits

  • Medical
  • Dental
  • Vision
  • Life
  • 401K match
  • PTO
  • Sick Time
  • 10 Paid Holidays
  • Remote work opportunities