Farm Credit · Posted about 2 months ago
Full-time • Mid Level
Columbia, SC
1,001-5,000 employees
Executive, Legislative, and Other General Government Support

The Data Automation and Scheduling Operations Lead plays a key role in advancing data reliability, operational efficiency, and automation maturity. This position ensures scalable, resilient workflows that support business continuity and timely decision-making, while aligning scheduling operations with enterprise goals to drive innovation, reduce risk, and deliver critical data services.

Responsibilities:
  • Lead a high-performing team of engineers and analysts focused on job scheduling and data automation.
  • Align automation strategies with business goals to deliver measurable outcomes.
  • Manage and optimize enterprise job scheduling platforms (e.g., UC4, Autosys) for reliability and scalability.
  • Ensure secure, compliant operations through effective access controls and configurations.
  • Modernize automation workflows and migrate scheduling processes to cloud-native platforms (e.g., AWS).
  • Standardize automation practices for consistency, maintainability, and performance.
  • Partner with cloud engineering to design resilient, scalable automation frameworks.
  • Enhance observability and uptime through monitoring standards and proactive issue resolution.
  • Support real-time data capabilities by advancing event-driven and streaming automation.
  • Strengthen disaster recovery readiness with robust planning, testing, and platform lifecycle management.

Qualifications:
  • Bachelor's degree in computer science or related field, or equivalent experience.
  • 6+ years in data operations, job scheduling, or automation engineering, with 2+ years in a leadership role.
  • Certifications: AWS Solutions Architect (Associate/Professional), AWS DevOps Engineer; FinOps and Control-M certifications a plus.
  • Strong experience with enterprise job scheduling platforms (e.g., UC4, Control-M, Autosys).
  • Proficient in cloud-native automation tools (e.g., AWS Step Functions, Lambda) and scripting languages (Python, Shell, PowerShell, SQL).
  • Familiarity with data pipeline orchestration tools (e.g., Apache Airflow, dbt) and disaster recovery principles.
  • Excellent leadership, communication, and collaboration skills with a proven ability to manage priorities and drive strategic initiatives.