The Clorox Company · Posted 3 days ago
Full-time • Senior
Hybrid • Pleasanton, CA
5,001-10,000 employees

Clorox is the place that’s committed to growth – for our people and our brands. Guided by our purpose and values, and with people at the center of everything we do, we believe every one of us can make a positive impact on consumers, communities, and teammates. Join our team. #CloroxIsThePlace

Your role at Clorox: We are seeking an experienced and highly skilled Senior Data Engineer to join our Enterprise Data Strategy and Operations team. The ideal candidate will have extensive expertise in designing, building, and maintaining data pipelines and data solution architectures on cloud platforms, particularly Azure. This role involves leading data engineering products, optimizing data processing workflows, ensuring data quality and governance, and collaborating with cross-functional teams to support analysis and insight generation that fuels large-scale ideation and roadmapping for revenue growth and margin improvement projects across the enterprise. The Senior Data Engineer at Clorox plays a key role in leading and delivering enterprise-quality data solutions that enable data-driven business decisions. The role calls for an analytical, big-picture thinker with a product mindset and a strong background in business intelligence and cloud engineering who can leverage technology to build scalable data products that create value across the organization. This role will also serve as a key collaborator with our business analytics and enterprise technology stakeholders to innovate, build, and sustain the cloud data infrastructure that will further Clorox’s digital transformation efforts.

In this role, you will:
  • Collaborate & Lead: Work closely with business product owners, data scientists, analysts, and cross-functional stakeholders to understand the business’ data needs and provide technical solutions. Influence business partners to align with the technical solutions and adhere to technical architecture standards. Provide technical guidance to junior engineers, BI developers, and contractors to create efficient and effective data solutions.
  • Architect & Innovate: Apply strong proficiency in Python, Spark, SQL, PySQL, Pandas, and CI/CD methodologies, along with data ingestion, data modeling, and dimensional modeling skills in a medallion lakehouse architecture. Build reports and dashboards using BI tools such as Power BI and Tableau. Implement reporting security such as row-level, column-level, and object-level security and data masking. Use SQL and DML to recast data in backend databases for data changes, restatements, and data processing errors. Support MLOps and data science workflow pipelines, and apply knowledge of Gen AI frameworks and LLMs to support agentic products.
  • Optimize & Scale: Build and maintain data pipelines to integrate data from various source systems (a brief pipeline sketch follows this list). Optimize data pipelines for performance, reliability, and cost-effectiveness. Work with enterprise infrastructure and technology teams to implement best practices for performance monitoring and cloud resource management, including scaling, cost control, and security.
  • Ensure Quality & Governance: Ensure safe custody, transport, and storage of data in the data platforms. Collaborate with Data Governance Stewards and business stakeholders to enforce business rules, data quality rules, and data cataloging activities. Ensure data quality, security, and compliance for the data products owned by this role.
  • Enhance BI Capabilities: Develop and manage business intelligence solutions that transform data into insights that drive business value. Help Analytics Product Owners and business leaders improve business decisions through data analytics, data visualization, and data modeling techniques and technologies.
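As a purely illustrative sketch of the pipeline and medallion lakehouse work described above, the snippet below shows one way a bronze-to-silver step might look in PySpark. The paths, table name, and columns (orders, order_id, net_sales) are hypothetical and not drawn from Clorox’s environment, and the example assumes a Spark session with Delta Lake available (for example, on Azure Databricks).

```python
# Minimal bronze -> silver step in a medallion lakehouse (illustrative only).
# Assumes a Spark session with Delta Lake configured, e.g., Azure Databricks;
# paths, table names, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_bronze_to_silver").getOrCreate()

# Bronze: raw orders landed as-is from a source system.
bronze = spark.read.format("delta").load("/lake/bronze/orders")

# Silver: cleaned, typed, de-duplicated records ready for dimensional modeling.
silver = (
    bronze
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    .withColumn("net_sales", F.col("net_sales").cast("decimal(18,2)"))
    .filter(F.col("order_id").isNotNull())
    .dropDuplicates(["order_id"])   # keep one row per business key
)

(
    silver.write
    .format("delta")
    .mode("overwrite")              # a real pipeline would likely MERGE incrementally
    .save("/lake/silver/orders")
)
```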

What we look for:
  • 7+ years of experience if the candidate holds a BS degree in Computer Science, Information Systems, or a related field; 5-7 years of experience if the candidate holds an MS/PhD degree
  • BS degree or higher in Computer Science, Information Systems, or a related field
  • Experience architecting data solutions, cloud data engineering, end-to-end data warehouse or lakehouse implementations, and end-to-end business intelligence implementations
  • Minimum 7 years of experience in data engineering, data warehousing, and business intelligence, with substantial experience managing large-scale data projects
  • 5+ years of experience implementing data solutions on cloud platforms such as Microsoft Azure and AWS
  • 4+ years of business intelligence experience using technologies such as Power BI and Tableau
  • Experience in end-to-end support for data engineering solutions (Data Pipelines), including designing, developing, deploying, and supporting solutions for existing platforms
  • Ability to build data solutions on cloud platforms such as Azure and AWS, using tools like Data Factory, Databricks, and Delta Lake.
  • Experience leading data projects and teams.
  • Ability to mentor junior engineers and collaborate effectively with stakeholders to deliver data-driven solutions.
  • Excellent verbal and written communication skills.
  • Ability to articulate ideas to both technical and non-technical audiences.
  • Strong problem-solving skills with an emphasis on product development.
  • Ability to translate between business requirements expressed in non-technical terms and their technical implementations.
  • Experience creating reusable engineering frameworks for quality checks, logging, scheduling, monitoring, and alerting.
  • 4+ years of experience with Azure services like Data Factory, Databricks, and Delta Lake will be an added advantage.
  • Knowledge or experience in Microsoft D365 Dataverse and reporting in Microsoft Fabric technology
  • Strong proficiency in Python, Spark, SQL, PySQL, Pandas, and CI/CD methodologies is required.
  • Strong data ingestion, data modeling, and dimensional modeling skills using a medallion lakehouse architecture.
  • Strong BI skills to build reports and dashboards using tools such as Power BI and Tableau.
  • Experience with reporting security such as row-level, column-level, and object-level security and data masking.
  • Experience with SQL and DML to recast data in backend databases for data changes, restatements, and data processing errors (a brief sketch follows this list).
  • Experience with MLOps and supporting data science workflow pipelines.
  • Knowledge of Gen AI frameworks and LLMs to support agentic products.
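To make the SQL/DML "recast" requirement above concrete, here is a hedged sketch of the kind of restatement a MERGE statement can apply, submitted through Spark SQL from Python. The schema and table names (silver.orders, corrections.order_restatements) and columns are hypothetical, and the MERGE INTO syntax assumes Delta Lake tables.

```python
# Illustrative only: apply restated figures from a corrections table to a fact
# table, then verify the fix. Names are hypothetical; assumes Delta Lake tables
# (which support MERGE INTO) and a Spark session, e.g., on Azure Databricks.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("restate_net_sales").getOrCreate()

# Recast: overwrite the incorrect values in place with the restated ones.
spark.sql("""
    MERGE INTO silver.orders AS t
    USING corrections.order_restatements AS f
        ON t.order_id = f.order_id
    WHEN MATCHED THEN UPDATE SET
        t.net_sales  = f.net_sales_restated,
        t.updated_at = current_timestamp()
""")

# Sanity check: after the restatement, no corrected order should still
# disagree with the corrections table.
spark.sql("""
    SELECT COUNT(*) AS remaining_mismatches
    FROM silver.orders t
    JOIN corrections.order_restatements f ON t.order_id = f.order_id
    WHERE t.net_sales <> f.net_sales_restated
""").show()
```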
We offer comprehensive, competitive benefits that prioritize all aspects of wellbeing and provide flexibility for our teammates’ unique needs, including:
  • robust health plans
  • a market-leading 401(k) program with a company match
  • flexible time off benefits (including half-day summer Fridays depending on location)
  • inclusive fertility/adoption benefits