About The Position

We are looking for a Senior Data Engineer specializing in Supply Chain data within our Data and Analytics team. This is an exciting opportunity to help grow and modernize analytics at Hormel Foods! Candidates will need strong communication skills and the ability to collaborate across all levels of the organization. You will be responsible for managing a team and leading simultaneous initiatives that require innovative problem-solving and technical expertise. You will use tools such as Google Cloud Platform, SQL, Python, Incorta, Oracle Business Intelligence, Tableau, and Informatica ETL to engineer data pipelines and data models that enhance enterprise reporting and analytics. Additionally, you will design and develop reports, dashboards, and visualizations using enterprise business intelligence tools (Oracle Business Intelligence and Tableau).

Specific competencies include:

  • Data Structures and Models - Designs, develops, and scales the overall database/data warehouse structure based on functional and technical requirements. Designs, develops, and scales data collection frameworks for structured and unstructured data.
  • Data Pipelines and ELT - Designs, applies, and scales data extraction, loading, and transformation techniques to connect large data sets from a variety of sources.
  • Data Performance - Independently troubleshoots and resolves data performance issues that come with querying and combining large volumes of data. Accounts for scaled performance in initial design.
  • Visualizations and Dashboards - Gathers requirements and designs and develops reports, dashboards, and visualizations from multiple sources that meet business needs. Understands the data and ideates innovative ways for the business to leverage it.

Requirements

  • A bachelor's degree in Computer Science, MIS, or a related area, plus significant experience with business intelligence, data engineering, and data modeling.
  • 7+ years of experience with reading and writing SQL.
  • 7+ years of experience engineering within a data warehouse or related experience with dimensional data modeling.
  • 7+ years of experience designing and developing ETLs/pipelines in Python, Google BigQuery Dataprocs and/or Informatica ETL.
  • Proven ability to gather detailed technical requirements to design and develop data structures supporting business intelligence report solutions from beginning to end.
  • Excellent written and verbal communication skills.
  • Strong organizational and time management skills.
  • Tested problem-solving and decision-making skills.
  • A strong pattern of initiative.
  • Highly developed interpersonal and leadership skills.
  • Demonstrated success in leading cross-functional data projects.
  • Applicants must not now, or at any time in the future, require employer sponsorship for a work visa.
  • Applicants must be authorized to work in the United States for any employer.

Nice To Haves

  • 5+ years of experience designing and developing within a business intelligence/reporting tool like Oracle Business Intelligence, Tableau or Power BI.
  • Experience with Supply Chain data, including logistics, inventory, procurement, and manufacturing.
  • Familiarity with ERP systems and supply chain planning tools.
  • Experience working within Google Cloud Platform with services like Dataflow, Datafusion, Pub/Sub, Cloud SQL, and Cloud Storage.
  • Advanced Oracle SQL skills, including analytical functions.
  • Experience in tuning complex SQL and ETLs.
  • Experience with a core semantic layer platform, including the physical, logical, and presentation layers of the enterprise business intelligence platform (e.g., the RPD in Oracle Business Intelligence Enterprise Edition (OBIEE) or the AtScale semantic layer).

Responsibilities

  • Work directly with the Supply Chain business unit to understand their analytics needs and gather requirements for analytics solutions.
  • Design and implement physical and logical data models for dimensions and facts across staging, warehouse, and semantic layers of enterprise data platforms.
  • Develop and optimize SQL, Python, Incorta, or Informatica ETLs and pipelines, including Google BigQuery Dataprocs, to ingest and transform data from diverse sources into dimensional models.
  • Utilize SQL within Google BigQuery, Informatica ETLs, Incorta pipelines, or Oracle SQL Views to derive metrics and dimension attributes.
  • Engineer and orchestrate batch and mini-batch data loads into enterprise data platforms.
  • Provide ongoing support and maintenance for existing Supply Chain data solutions within enterprise data warehouses.
  • Use tools such as SQL, Oracle Business Intelligence, Tableau, Google Cloud Platform, Python, Incorta, and Informatica ETL to build robust data pipelines and models that support enterprise analytics.
  • Design and develop dashboards and reports with intuitive user interfaces and optimized performance for Supply Chain operations.

Benefits

  • Competitive base salary plus bonus, annual merit-increase performance reviews, medical, dental, and vision coverage, non-contributory pension, profit sharing, 401(k), stock purchase plan, relocation assistance, and paid vacation.