Universities of Wisconsin
$87,000 - $115,000/Yr
Full-time • Mid Level
Madison, WI
5,001-10,000 employees

The Workday Enterprise Solutions Team (WEST) is responsible for the administration and optimization of Workday and related systems for Finance, Human Resources, and Research Administration across the Universities of Wisconsin, while focusing on people, processes, and technology. The WEST Data and Reporting team supports data and reporting needs across the UW universities and the UW System through the development and maintenance of an enterprise data warehouse (the Enterprise Analytics Platform, EAP), Tableau dashboards, Workday reports, BIRT reports, Prism reports, Workday dashboards and discovery boards, and more.

We are seeking experienced Data Warehouse Engineers to contribute to the development, support, and maintenance of our AWS Redshift-based data lake and corresponding data views. Familiarity with Workday data models is a bonus. The Data Warehouse Engineers will work closely with the Workday integration team, Workday report developers, and customers to gather requirements and create solutions that best serve their needs. To be successful in this position, you will need to be a self-starter who seeks collaboration with customers and peers across the organization. You will need to actively foster a respectful, inclusive, positive, and safe team environment.

Since we review overnight processing every morning, a schedule of Monday - Friday beginning no later than 7:30 AM CST and ending no earlier than 3:00 PM CST is required. There are currently no on-call expectations. However, during a critical outage, off-hours and weekend work may be required.

Responsibilities:

  • Conducts complex data exploration steps, such as profiling, understanding data quality, binning, pivoting, summarizing, and finding correlations on multiple data types and sources
  • Collaborates closely with data engineers, business analysts, functional teams, and development teams on the design, development, testing, and maintenance of robust and scalable data pipelines from Workday to AWS Redshift
  • May provide code review, coaching, and oversight of the activities of lower-level data modelers and data curation developers
  • Conducts assessments and/or provides recommendations on source connection options for access and on optimal data pipelines for development
  • Develops and unit tests high-complexity loads and/or mappings across a suite of tools and deploys them to production
  • Presents the results of highly complex recommendations to address data quality issues, and coaches lower-level staff in performing data quality clean-up initiatives with metrics and reports
  • Develops and documents the inventory of complex data warehouse assets, including adding descriptions and making them discoverable for business use
  • Designs and/or builds, tests, and deploys the cleansing, integration, and transformation of more complex data in accordance with the defined target data model, based on enterprise data definitions and quality measures provided by data stewards
  • Designs and/or builds, tests, and deploys conceptual, logical, and physical data models according to specifications and standards for document naming, security, lifecycle, and retention architectures
  • May provide architecture options analysis
  • Maintains an understanding of the data landscape within Workday and the Enterprise Analytics Platform (EAP) to ensure that data objects and designs efficiently and effectively support business needs
  • May consult with data subject matter experts to document and address data quality issues and ensure data is assured
  • May support data quality clean-up initiatives with metrics and reports
Qualifications:

  • Minimum of 5 years of hands-on experience in data engineering for an enterprise-level data lake or data warehouse, with a strong understanding of ETL/ELT best practices, data integration, data modeling, and data transformation
  • Ability to work collaboratively in a fast-paced, Agile environment
  • Ability to effectively communicate technical concepts to non-technical stakeholders
  • Solid coding and problem-solving skills, including a high level of attention to detail and an understanding of the importance of validating data quality and accuracy
  • Expertise in complex SQL programming
  • Proficiency with AWS Redshift stored procedures for efficient data manipulation and transformation
  • Hands-on experience with related tools such as AWS Glue, S3, Athena, and Apache Iceberg
  • Experience developing with Workday Finance and HR as a source system for a data lake or data warehouse
  • Experience with version control systems such as Bitbucket or Git for maintaining a structured code repository
  • Experience with complex ETL scenarios, such as change data capture (CDC) and slowly changing dimensions (SCD), and with integrating data from multiple source systems
  • Experience working in a highly complex Python/PySpark object-oriented platform
  • Experience with data modeling, table management, and query optimization
  • Ability to automate and optimize data delivery by improving existing processes and redesigning infrastructure for greater efficiency and scalability
  • Certification(s) or formal training related to AWS products, data engineering, or databases
  • Experience with Tableau development and dashboard creation
Benefits:

  • Generous paid time off
  • Competitively priced health/dental/vision/life insurance
  • Tax-advantaged savings accounts
  • Participation in the nationally recognized Wisconsin Retirement System (WRS) pension fund