Lead IT Data Engineer (SQL/Python)

Tyson Foods, Inc. · Springdale, AR

About The Position

The Lead IT Data Engineer owns and drives the organization's data strategy, setting the technical direction for data engineering, enterprise data modeling, and agentic AI adoption. This role is responsible for platform architecture, tool selection, cloud strategy, budget and capacity planning, and the design of enterprise AI agent systems — all while establishing governance frameworks for responsible data and AI practices and mentoring the engineering team.

Requirements

  • Bachelor's Degree or relevant experience.
  • 5+ years of relevant and practical experience.
  • Expert proficiency in Python and SQL for data engineering at scale.
  • Expertise in modern data platforms (Databricks, Snowflake, BigQuery), lakehouse architectures (Delta Lake, Iceberg), and streaming (Kafka, Flink, Pub/Sub).
  • Deep expertise in at least one major cloud platform with cross-cloud awareness.
  • Mastery of orchestration (Airflow, Dagster), transformation (dbt), containerization (Docker, K8s), and IaC (Terraform).
  • Advanced enterprise data modeling, warehousing, data contracts, and API design.
  • Expertise in CI/CD, data observability, governance, data mesh, and platform reliability.
  • Experience in technical roadmap ownership, build-vs-buy evaluation, and budget/capacity planning.
  • Expert-level knowledge of agentic AI architectures, LLMOps, RAG, knowledge graphs, and AI governance/safety/security.
  • Leadership: Owning and driving data and AI strategy across the organization.
  • Strategic Vision: Translating business objectives into actionable technical roadmaps.
  • Stakeholder Management: Building relationships with partners and executive leadership.
  • Communication: Presenting complex data and AI concepts to board-level audiences.
  • Mentorship: Developing the data engineering team's data and AI competencies.
  • Decision-Making: Making high-impact choices on architecture, platforms, and investments.
  • Change Management: Guiding the organization through data and AI transformations.
  • Innovation & Thought Leadership: Driving industry best practices in data engineering, modeling, and agentic AI.
  • Negotiation: Balancing technical requirements with business needs and resource constraints.
  • This position is not eligible for visa sponsorship now or in the future.

Nice To Haves

  • AWS Solutions Architect Professional, Google Professional Data Engineer, Azure Solutions Architect Expert, Databricks Certified Data Engineer Professional, or equivalent.

Responsibilities

  • Own and drive the overall data strategy, including the multi-quarter technical roadmap, platform architecture, and data engineering standards.
  • Architect end-to-end data solutions across cloud platforms (AWS, GCP, or Azure), setting standards for orchestration (Airflow, Dagster), transformation (dbt), streaming (Kafka, Flink), and storage (Delta Lake, Iceberg, Snowflake, BigQuery).
  • Own the enterprise data modeling strategy — crafting scalable models using dimensional, multi-dimensional, and advanced normalization techniques, with enterprise-wide documentation and metadata governance.
  • Define API design standards and data contracts to ensure reliable, well-governed interfaces between data producers and consumers.
  • Establish enterprise-level data governance, security, and compliance frameworks across all data and AI systems, including access controls, cataloging, and lineage.
  • Define and enforce CI/CD standards for data pipelines, containerized architectures (Docker, K8s), and infrastructure as code (Terraform).
  • Drive data observability practices and platform reliability at enterprise scale.
  • Drive build-vs-buy evaluations for data and AI tools, considering TCO, vendor lock-in, scalability, and organizational fit; manage vendor relationships.
  • Own or co-own infrastructure budget and capacity planning for data platform resources; optimize cloud costs at the organizational level.
  • Define and drive the organization's agentic AI strategy, architecting enterprise-scale multi-agent systems, autonomous data pipelines, and RAG/knowledge graph platforms.
  • Establish AI governance frameworks, including ethics policies, bias detection, safety guardrails, security standards (prompt injection, data exfiltration, PII), and compliance with emerging regulations (e.g., EU AI Act).
  • Establish LLMOps practices at scale — model deployment, prompt versioning, A/B testing, performance monitoring, drift detection, and cost optimization.
  • Design human-in-the-loop escalation paths for critical AI-driven decisions, ensuring appropriate oversight.
  • Lead AI platform evaluation and integration, including TCO analysis, data residency, and SLA requirements for agentic frameworks.
  • Set software engineering best practices — code review standards, design patterns, technical debt management, and documentation.
  • Advocate for and lead adoption of data mesh and data-as-a-product principles.
  • Mentor the engineering team on data engineering, data modeling, and AI best practices.
  • Perform other assigned job-related duties that align with our organization's vision, mission, and values and fall within your scope of practice.

Benefits

  • We provide our team members and their families with paid time off; 401(k) plans; affordable health, life, dental, vision and prescription drug benefits; and more.