Nike • Posted 5 months ago
$149,100 - $313,900/Yr
Full-time • Senior
Remote • Beaverton, OR
5,001-10,000 employees
Leather and Allied Product Manufacturing

Become a Part of the NIKE, Inc. Team. NIKE, Inc. does more than outfit the world's best athletes. It is a place to explore potential, obliterate boundaries and push out the edges of what can be. The company looks for people who can grow, think, dream and create. Its culture thrives by embracing diversity and rewarding imagination. The brand seeks achievers, leaders and visionaries. At NIKE, Inc. it's about each person bringing skills and passion to a challenging and constantly evolving game.

  • Build and deliver scalable data and analytics solutions focused on the Nike Direct, Supply Chain, and Commercial spaces.
  • Design, implement, and integrate new technologies and evolve data and analytics products.
  • Contribute to all aspects of data and software engineering, from ingestion and transformation through consumption.
  • Use test-driven development to design and build reusable frameworks, automated workflows, and libraries at scale to support analytics products.
  • Participate in architecture and design discussions to process and store high-volume data sets.
  • Bachelor's or master's degree in computer science or a related technical field, or an equivalent combination of education and experience.
  • 10+ years of experience as a software engineer, data engineer, or architect designing, building, and coding distributed data management systems for large-scale data architectures, including data lakes, data warehouses, and cloud-based solutions.
  • 5+ years of experience working with Hadoop and Big Data processing frameworks (Spark, Hive, NiFi, Spark Streaming, Flink, etc.).
  • Strong hands-on proficiency in modern data technologies and tools, such as Spark, SQL, NoSQL, and cloud platforms (e.g., AWS, Azure, Google Cloud).
  • Proficient in programming languages like Python, Scala, or Java, with a deep understanding of data ingestion, transformation, and pipeline optimization.
  • Extensive experience with source control tools such as GitHub, and knowledge of CI/CD, DevOps, and distributed systems.
  • Experience with workflow scheduling tools like Airflow.
  • Experience building and exposing RESTful APIs to enable real-time data consumption.
  • Solid foundation in data modeling, data structures, and algorithmic design, with experience supporting AI/ML and advanced analytics use cases.
  • Proven ability to define and operationalize data quality metrics, SLAs, and observability in complex data pipelines.
  • Excellent leadership, communication, and collaboration skills, with a track record of mentoring and fostering a culture of technical excellence.
  • Track record of eliminating workflow inefficiencies and reducing manual toil through automation.
  • A passion for innovation and staying current with industry trends and emerging technologies.