Nike · posted 3 months ago
Full-time • Senior
Memphis, TN
5,001-10,000 employees
Leather and Allied Product Manufacturing

As a Senior Data Engineer at NIKE RETAIL SERVICES INC., you will design and build data products and features in collaboration with product owners, data analysts, and business partners, working within an Agile/Scrum methodology. You will contribute to the architecture, frameworks, and patterns used to process and store large data volumes, and will evaluate new technologies, tools, and frameworks for high-volume data processing. The full set of responsibilities is listed below. Telecommuting is available only in AR, MS, and TN.

Responsibilities:
  • Design and implement data products and features in collaboration with product owners, data analysts, and business partners using Agile / Scrum methodology.
  • Contribute to overall architecture, frameworks, and patterns for processing and storing large data volumes.
  • Evaluate and utilize new technologies/tools/frameworks centered around high-volume data processing.
  • Translate product backlog items into engineering designs and logical units of work.
  • Analyze data for the purpose of designing scalable solutions.
  • Define and apply appropriate data acquisition and consumption strategies for given technical scenarios.
  • Design and implement distributed data processing pipelines using tools and languages prevalent in the big data ecosystem.
  • Build utilities, user-defined functions, libraries, and frameworks to better enable data flow patterns.
  • Implement complex automated routines using workflow orchestration tools.
  • Anticipate, identify, and tackle issues concerning data management to improve data quality.
  • Build and incorporate automated unit tests and participate in integration testing efforts.

Requirements:
  • Must have a Master's degree in Computer Science, Computer Engineering, or Applied Computer Science.
  • 2 years of experience in the job offered or a computer-related occupation.
  • Experience with Snowflake.
  • Experience with AWS Native Services.
  • Experience with Oracle.
  • Experience with DB2.
  • Experience with MS SQL Server.
  • Experience with MS SSIS.
  • Experience with Attunity.
  • Experience with Apache Airflow.
  • Experience with Python.
  • Experience with Confluence.
  • Experience with Tableau.
  • Experience with GitHub.
  • Experience with Databricks.
  • Experience with Jenkins.
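
To illustrate the kind of work described above (not part of the posting itself), here is a minimal sketch in plain Python of a unit-testable pipeline step, the pattern that tools like Apache Airflow orchestrate. The field names (`sku`, `units`) and the extract/transform split are illustrative assumptions, not anything specified by the role.

```python
from dataclasses import dataclass

# Each pipeline step is a small, pure function so it can be unit-tested
# in isolation and later wired into an orchestrator (e.g. an Airflow DAG).
# All names here are hypothetical examples.

@dataclass
class Record:
    sku: str
    units_sold: int

def extract(raw_rows: list[dict]) -> list[Record]:
    """Parse raw rows, dropping malformed ones (a basic data-quality gate)."""
    records = []
    for row in raw_rows:
        try:
            records.append(Record(sku=str(row["sku"]),
                                  units_sold=int(row["units"])))
        except (KeyError, ValueError):
            continue  # malformed row: skip (or route to a quarantine table)
    return records

def transform(records: list[Record]) -> dict[str, int]:
    """Aggregate units sold per SKU."""
    totals: dict[str, int] = {}
    for r in records:
        totals[r.sku] = totals.get(r.sku, 0) + r.units_sold
    return totals

def test_transform():
    # Automated unit test for the transform step.
    recs = [Record("A", 2), Record("B", 1), Record("A", 3)]
    assert transform(recs) == {"A": 5, "B": 1}
```

Keeping extract and transform as separate, side-effect-free functions is what makes the "build and incorporate automated unit tests" responsibility practical: each step can be asserted on directly without standing up the full pipeline.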
© 2024 Teal Labs, Inc