Cox Communications · posted about 1 year ago
$106,700 - $177,900/Yr
Full-time • Manager
Remote • Raleigh, NC
Broadcasting and Content Providers

The Lead Data Engineer at RapidScale is responsible for modernizing data infrastructure by migrating customer systems from legacy on-premises solutions to cloud-based architectures on GCP, AWS, and Azure. The role involves designing, building, and maintaining scalable data pipelines and storage solutions, collaborating with cross-functional teams, and implementing data engineering best practices.

Responsibilities:

  • Implement data lake, data warehousing, ETL, streaming, and data analytics solutions across GCP, AWS, and Azure platforms.
  • Migrate data and processes from legacy on-premises systems (e.g., SQL Server and other relational databases) to cloud-based solutions.
  • Design and develop efficient, scalable data pipelines to ingest, process, and transform data from various sources.
  • Optimize data storage and retrieval systems for performance and cost-efficiency.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver appropriate solutions.
  • Implement data quality checks and monitoring systems to ensure data integrity and reliability.
  • Contribute to the development of best practices and standards for data engineering within the organization.
  • Participate in code reviews and knowledge sharing sessions with the team.
  • Implement and maintain CI/CD pipelines for data engineering projects.
  • Use Infrastructure as Code (IaC) practices to manage and version cloud resources.
Qualifications:

  • Bachelor's degree in a related discipline and 6 years' experience in a related field; OR a Master's degree and 4 years' experience; OR a Ph.D. and 1 year of experience; OR 18 years' experience in a related field.
  • 3 years of hands-on experience designing and implementing cloud data solutions on at least one of GCP, AWS, or Azure.
  • Proficiency in at least two programming languages (e.g., JavaScript/Node.js, Go, or Python), with working knowledge of additional languages.
  • Experience with transforming legacy code (e.g., Java, .Net) into cloud-native microservices.
  • Experience building and maintaining CI/CD pipelines and utilizing cloud automation tools for efficient software deployment.
  • Experience programming in Python or SQL.
  • Experience with Terraform for Infrastructure as Code (IaC).
  • Experience with machine learning and AI services (e.g., Google Cloud AI Platform, Amazon SageMaker, Azure Machine Learning).
  • Knowledge of data modeling, data warehousing concepts, and dimensional modeling.
  • Professional certifications from GCP, AWS, and/or Azure.
  • Knowledge of containerization technologies (e.g., Docker, Kubernetes).
Benefits:

  • Flexible vacation policy allowing employees to take as much paid vacation as they deem consistent with their duties.
  • Seven paid holidays throughout the calendar year.
  • Up to 160 hours of paid wellness leave annually, for employees' own wellness or that of family members.
  • Additional paid time off for bereavement leave, time off to vote, jury duty leave, volunteer time off, military leave, parental leave, and COVID-19 vaccination leave.
  • Health care insurance (medical, dental, vision).
  • Retirement planning (401(k)).
  • Paid days off (sick leave, parental leave, flexible vacation/wellness days, and/or PTO).
© 2024 Teal Labs, Inc