Interface • Posted 4 months ago
Full-time • Mid Level
Hybrid • Atlanta, GA
1,001-5,000 employees
Textile Product Mills

The Global Data Engineer will play a key role in designing, developing, and deploying high-performance data pipelines and infrastructure that power the enterprise data platform under the guidance of senior and lead engineers. Collaborating with Business Intelligence, Infrastructure, Business Analytics, and global IT and business teams, this role will help deliver scalable, production-ready solutions that support advanced analytics, reporting, and data-driven decision-making across the organization. The Data Engineer will contribute to the modernization of our data architecture, ensure high-quality, reliable data delivery, and support the implementation of governed self-service analytics using tools such as Power BI and Microsoft Fabric. This role is ideal for a data professional ready to contribute hands-on to enterprise projects and grow within a technically collaborative environment.

Responsibilities:

  • Design and develop scalable data models, pipelines, and infrastructure in Azure to drive insights, reporting, mobile/web applications, and machine learning
  • Support data engineering and data science projects for global Big Data initiatives
  • Develop and automate high-volume batch and real-time ETL pipelines using Azure Data Factory, Azure SQL Database, Databricks, and Python
  • Use Microsoft Fabric and Power BI to create impactful semantic data models, dashboards, advanced DAX measures, calculated columns, data transformations, and data visualizations
  • Apply technical expertise in performance tuning, Power Query, SQL scripting, and row-level security (RLS)
  • Deploy backend production services with an emphasis on high availability, robustness, and monitoring
  • Troubleshoot data processing performance issues and data quality problems with guidance from senior engineers
  • Design and execute test plans to validate the accuracy and completeness of data flowing through ETL pipelines and reports
  • Follow the continuous integration process by committing all code to version control repositories
  • Collaborate with cross-functional teams to understand business requirements, create comprehensive test plans, and translate requirements into technical solutions, reports, and dashboards
  • Collaborate with development teams and solution architects to define infrastructure and deployment requirements for data warehousing and data modeling
  • Implement machine learning models in collaboration with data scientists
  • Work closely with senior data engineers and analysts to learn best practices in data modeling, performance tuning, and stakeholder delivery
  • Continuously increase knowledge of Business Intelligence applications and tools
  • Learn and understand Interface's commitment to sustainability
  • Perform other duties as assigned

Qualifications:

  • Bachelor's degree in Computer Science, Engineering, Data Science, or a related discipline
  • 4+ years of experience in Data Engineering, Software Engineering, Data Science, Machine Learning, and Artificial Intelligence using Snowflake, Azure, or AWS cloud technologies
  • 4+ years of experience in Python programming, machine learning, artificial intelligence, system design, data structures, and algorithms in software development and high-volume, distributed systems
  • 4+ years of experience processing and modeling data in Python, SQL, Azure Analysis Services, Azure Data Factory, Databricks, SSAS, Qlik, Power BI, Microsoft Fabric, and Tableau, with a strong understanding of star and snowflake schemas, OLAP/OLTP, and software engineering
  • 3+ years of experience working on an engineering team building out QA practices
  • Strong understanding of SQL and experience with relational databases
  • Strong understanding of data structures, data types, data transformation, and data performance tuning
  • Experience with Python and data transformation and quality check libraries such as PySpark, pandas, and Great Expectations
  • Strong Excel knowledge for validating data (VBA, macros, pivot tables, formulas, etc.)
  • Strong analytical, problem-solving, and debugging skills, with the ability to learn and comprehend business processes quickly
  • Experience with data integration and management tools
  • Knowledge of Power BI or other data visualization tools such as Tableau
  • Hands-on experience with Azure DevOps, Git, and other CI/CD tools
  • Knowledge of Infrastructure as Code (IaC) and provisioning tools such as Terraform, Ansible, Jenkins, or ARM templates in Azure
  • Experience with scripting languages such as PowerShell and with JSON or YAML file formats
  • Experience with machine learning is a plus
  • Experience working on cross-functional teams and projects, and effectively communicating with stakeholders at multiple levels
  • Exceptional verbal and written communications skills, with an ability to express complex technical concepts in business terms
  • Solid organizational skills when managing multiple projects and the ability to meet deadlines

Benefits:

  • Access to LinkedIn Learning for skill development and industry trends