Core & Main · Posted 3 days ago
Full-time • Mid Level
Saint Louis, MO
5,001-10,000 employees

Based in St. Louis, Core & Main is a leader in advancing reliable infrastructure™ with local service, nationwide®. As a specialty distributor with a focus on water, wastewater, storm drainage and fire protection products and related services, Core & Main provides solutions to municipalities, private water companies and professional contractors across municipal, non-residential and residential end markets, nationwide. With over 370 locations across the U.S., the company provides its customers local expertise backed by a national supply chain. Core & Main’s 5,700 associates are committed to helping their communities thrive with safe and reliable infrastructure. Visit coreandmain.com to learn more.

Job Summary

Design, develop, test, and maintain data solutions and pipelines. Serve as a subject matter expert in data engineering, data integration, and cloud-based data platforms. Maintain, enhance, and provide solutions for data warehousing and analytics environments. Take ownership of technical deliverables and ensure high-quality, reliable data solutions that meet business needs.

Responsibilities

  • Collaborate with stakeholders to analyze data solution requirements, identify gaps, and assess feasibility.
  • Take responsibility for the design, documentation, and implementation of technical solutions that meet business and functional requirements.
  • Develop, implement, and maintain data pipelines, data warehouses, and data lake solutions.
  • Ensure data systems possess sufficient controls and meet compliance standards.
  • Perform unit testing prior to moving code/configuration to the QA process (a brief example follows this list).
  • Evaluate and research upgrades, patches, and new functionality.
  • Troubleshoot and resolve defects in data solutions.
  • Contribute to the development and definition of test plans and scripts for performance, regression, and user acceptance testing; support QA activities as required.
  • Build and maintain data models, data mappings, transformation rules, workflows, data extractions and imports, interfaces, and object models.
  • Ensure data solutions comply with security protocols and data governance standards.
  • Share expertise with team members and participate in peer reviews to uphold technical standards.
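As a rough illustration of the unit-testing responsibility above, the sketch below shows one way a pipeline transformation might be tested before promotion to QA. The clean_orders function, its columns, and the expected values are hypothetical examples, not part of this posting.

```python
# Minimal sketch of a pre-QA unit test for a pipeline transformation.
# clean_orders and its columns (order_id, amount) are hypothetical examples,
# not a function from the actual role or codebase.
import pandas as pd
import pytest


def clean_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows with a missing order_id and round amounts to two decimals."""
    cleaned = df.dropna(subset=["order_id"]).copy()
    cleaned["amount"] = cleaned["amount"].round(2)
    return cleaned


def test_clean_orders_drops_missing_ids_and_rounds():
    raw = pd.DataFrame(
        {"order_id": [1, None, 3], "amount": [10.559, 20.0, 30.111]}
    )
    result = clean_orders(raw)
    assert result["order_id"].notna().all()                            # no missing IDs survive
    assert result["amount"].tolist() == pytest.approx([10.56, 30.11])  # amounts rounded
```

Tests like this run in the development environment before code or configuration is handed off, so defects surface ahead of the formal QA cycle.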
Qualifications

  • 5+ years of hands-on development experience in SQL and/or Python for data warehouse management, data integration, and data lake management.
  • Deep working knowledge of SQL development, using T-SQL to design, implement, and optimize complex database objects such as tables, views, stored procedures, indexes, and functions.
  • Experience working with Azure data architecture, including a solid understanding of tools for building data pipelines on cloud-based data platforms, such as the Delta Lakehouse medallion architecture and data warehousing solutions.
  • Exposure to modern Spark-based data platforms like Databricks or Microsoft Fabric for data engineering tasks, including leveraging their capabilities for scalable data processing, analytics, and machine learning workflows in a cloud-based environment.
  • Understanding of ELT versus ETL and how to build efficient data pipelines with modern Change Data Capture (CDC) processes (an illustrative sketch follows this list).
  • Hands-on experience with CI/CD pipelines in Azure DevOps and understanding of Agile development methodologies.
  • Familiarity with common data mapping and transformation techniques for Dynamics 365 Data Entities and the Data Management Framework in the Finance and Operations modules.
  • Familiarity with Power BI and its integration with Microsoft Fabric for end-to-end analytics.
  • Strong communication skills with the ability to translate complex technical concepts into business-friendly language.
  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
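To illustrate the lakehouse and Change Data Capture items above, here is a minimal sketch of a bronze-to-silver upsert in a Delta medallion layout. It assumes a Databricks- or Fabric-style Spark environment with the delta-spark package available; the table and column names (bronze_orders, silver_orders, order_id, updated_at) and the change window are hypothetical, not Core & Main's actual schema.

```python
# Minimal sketch: bronze -> silver step in a Delta Lakehouse medallion layout,
# applying changes with a CDC-style MERGE instead of a full reload.
# Table and column names below are illustrative assumptions only.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# ELT pattern: raw records land in the bronze table first; transformation
# happens inside the platform rather than before loading.
incoming = (
    spark.read.table("bronze_orders")                        # raw, append-only landing table
    .filter(F.col("updated_at") > F.lit("2024-01-01"))       # only recent changes (CDC window)
)

# Upsert the changed rows into the curated silver table.
silver = DeltaTable.forName(spark, "silver_orders")
(
    silver.alias("t")
    .merge(incoming.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()      # existing keys: apply the latest change
    .whenNotMatchedInsertAll()   # new keys: insert
    .execute()
)
```

In an ELT pattern like this, raw data lands in bronze first and transformations run inside the platform, so the merge touches only the rows that changed rather than reloading the whole table.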