Data Recognition Corporation - Maple Grove, MN

posted about 2 months ago

Full-time
Remote - Maple Grove, MN
Educational Services

About the position

The Data Engineer in Test (Quality focus) at Data Recognition Corporation (DRC) is responsible for ensuring the quality and reliability of data delivery pipelines. This role involves collaborating with multiple development teams and stakeholders to implement, validate, and reconcile data changes while delivering high-quality business intelligence (BI) reporting and analytics solutions. The position requires a blend of data engineering and test engineering skills, focusing on operationalizing data quality and supporting critical BI reporting needs.

Responsibilities

  • Implement and configure data/metadata changes and ETL across the staging and target steps of production data pipelines
  • Design and build test data, test strategies, and test plans to verify data transformations, data quality rules, and data integrations
  • Complete data validations through detailed comparative analyses of data sets from different sources to ensure data consistency and accuracy (an illustrative sketch follows this list)
  • Perform automated and manual data reconciliation as needed to ensure end-to-end data and BI reporting completeness and fidelity
  • Triage and resolve ETL and data/metadata quality issues, performance bottlenecks, and scalability concerns that arise before or after deployment/release
  • Document validation scenarios and results, defects, and test coverage in fulfillment of data governance and compliance requirements
  • Contribute to the team's requirements and gap analyses, work planning, and deployment orchestration for data quality and reporting deliveries
  • Apply and build upon your experience in data engineering, data quality management, and data validation best practices
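
For illustration only, a minimal Python sketch of the kind of comparative validation and reconciliation described above: it compares row counts and order-insensitive column checksums between two extracts of the same data taken at different pipeline steps. The file names and column names are hypothetical placeholders, not references to DRC systems.

    import csv
    import hashlib

    def column_checksum(path, column):
        # Order-insensitive checksum: XOR of per-value SHA-256 digests, so the
        # two extracts do not need to be sorted the same way.
        digest = 0
        with open(path, newline="", encoding="utf-8") as handle:
            for row in csv.DictReader(handle):
                digest ^= int(hashlib.sha256(row[column].encode()).hexdigest(), 16)
        return digest

    def row_count(path):
        with open(path, newline="", encoding="utf-8") as handle:
            return sum(1 for _ in csv.DictReader(handle))

    source = "staging_extract.csv"      # hypothetical staging-step export
    target = "bi_target_extract.csv"    # hypothetical BI/target-step export
    assert row_count(source) == row_count(target), "row counts differ"
    for col in ("student_id", "scale_score"):   # hypothetical key/measure columns
        assert column_checksum(source, col) == column_checksum(target, col), f"{col} values differ"
    print("reconciliation checks passed")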

Requirements

  • 3+ years of related experience
  • Ability to communicate requirements and details to non-technical users
  • Comfortable with collaboration and working in a fast-paced environment
  • Solid understanding of relational database fundamentals, SQL, and data engineering/coding for various database technologies (such as MS SQL Server, PostgreSQL, Redshift, DynamoDB, Snowflake) and methodologies (such as Relational, Star Schema, NoSQL/XML/JSON)
  • Experience building and optimizing database queries and models
  • Experience building processes that automate/support data transformations, validations, and workload orchestration (an illustrative sketch follows this list)
  • Familiarity with shell scripting (such as Unix Bash, Microsoft PowerShell)
  • Ability to review and understand business requirements
  • Ability to review and create detailed technical documentation
  • Proficient in creating business process and data flow diagrams using tools such as MS Visio and Lucidchart
  • Familiarity with Microsoft Office, Atlassian tools, the Google suite, and applicable data analytics tools
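
For illustration only, a minimal Python sketch of automating SQL-based data quality rules of the kind listed above. sqlite3 stands in here for whichever engine is actually in use (SQL Server, PostgreSQL, Redshift, Snowflake, and so on); the database file, table, and column names are hypothetical.

    import sqlite3

    # Each rule returns a count of offending rows; a non-zero count is a defect.
    QUALITY_RULES = {
        "null_student_ids": "SELECT COUNT(*) FROM scores WHERE student_id IS NULL",
        "duplicate_score_keys": (
            "SELECT COUNT(*) FROM ("
            "  SELECT student_id, test_id FROM scores"
            "  GROUP BY student_id, test_id HAVING COUNT(*) > 1"
            ") AS dupes"
        ),
    }

    def run_quality_rules(connection):
        failures = {}
        for name, sql in QUALITY_RULES.items():
            offending = connection.execute(sql).fetchone()[0]
            if offending:
                failures[name] = offending
        return failures

    if __name__ == "__main__":
        with sqlite3.connect("warehouse.db") as conn:   # hypothetical database file
            failures = run_quality_rules(conn)
        if failures:
            raise SystemExit(f"data quality rules failed: {failures}")
        print("all data quality rules passed")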

Nice-to-haves

  • Familiarity with web scripting in one or more of: HTML, CSS, JavaScript, DevExtreme, Highcharts, etc.
  • Familiarity with programming in one or more of the following: Python, Java, C#/.NET, VB.NET, R
  • Understanding of Web Services / REST / GraphQL core concepts
  • Experience working with, configuring, or administering BI reporting platforms such as ThoughtSpot, Tableau, Power BI, Domo, etc.
  • Experience working with cloud technologies such as AWS, Azure, or Google Cloud
  • Familiarity with Agile development methodologies, such as: Kanban, Scrum
  • Understanding of test-driven development and CI/CD deployment pipelines
  • Experience with work management and CI/CD tools such as JIRA, Jenkins, and GitHub
  • Experience scripting for API integration testing, e.g. with Postman (an illustrative sketch follows this list)
  • Bachelor's or higher degree, preferably in Information Systems, Computer Science, Mathematics, Physics, Engineering or a related discipline
  • Central time zone is preferred
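
For illustration only, a minimal Python sketch of the kind of API integration check typically scripted in Postman: status code, content type, and required response fields. The endpoint URL and expected JSON fields are invented for this example.

    import requests

    def test_report_status_endpoint():
        # Basic contract checks against a hypothetical reporting API endpoint.
        response = requests.get(
            "https://example.com/api/v1/reports/123",   # hypothetical endpoint
            timeout=10,
        )
        assert response.status_code == 200
        assert response.headers["Content-Type"].startswith("application/json")
        payload = response.json()
        assert {"report_id", "status"} <= payload.keys()

    if __name__ == "__main__":
        test_report_status_endpoint()
        print("API integration check passed")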