About The Position

Oowlish, a rapidly expanding software development company in Latin America, is seeking experienced technology professionals to join its diverse and vibrant team. As a member, you will collaborate with premier clients from the United States and Europe on pioneering digital solutions. Oowlish is certified as a Great Place to Work and offers professional development, growth, and international impact, with the convenience of remote work for a balanced lifestyle. Candidates should be passionate about technology, proficient in English, and excited about collaborating remotely with colleagues worldwide.

The role is for a Data Engineer with strong experience building scalable data pipelines and exposure to Machine Learning workflows. This position focuses on designing and maintaining robust data infrastructure while supporting data-driven and AI-powered applications. You will work closely with data scientists, engineers, and product teams to ensure data is reliable, accessible, and ready for advanced use cases, combining strong engineering fundamentals with the ability to support ML pipelines and data workflows in production environments.

Requirements

  • 4+ years of experience in Data Engineering or similar roles
  • Strong proficiency in Python for data processing and pipeline development
  • Hands-on experience with Apache Airflow (or similar orchestration tools)
  • Experience building and maintaining ETL/ELT pipelines in production
  • Strong knowledge of SQL and relational databases
  • Experience working with large-scale datasets
  • Exposure to Machine Learning workflows or data pipelines supporting ML models
  • Experience working with cloud environments (AWS, GCP, or Azure)
  • Strong problem-solving skills and ability to work independently

Nice To Haves

  • Experience with ML frameworks (TensorFlow, PyTorch, Scikit-learn)
  • Experience with data warehouses (Snowflake, BigQuery, Redshift)
  • Experience with streaming technologies (Kafka, Kinesis)
  • Familiarity with feature engineering and model data preparation
  • Experience with CI/CD pipelines for data workflows

Responsibilities

  • Design, build, and maintain scalable data pipelines using Python and Airflow
  • Develop and optimize ETL/ELT processes for structured and unstructured data
  • Collaborate with data science teams to support Machine Learning workflows
  • Ensure data quality, reliability, and performance across systems
  • Work with large datasets and optimize queries and transformations
  • Integrate data from multiple sources and external systems
  • Monitor and improve pipeline performance and reliability
  • Support deployment and maintenance of data-driven and ML-enabled applications
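The responsibilities above center on building ETL/ELT pipelines in Python with data-quality checks. As a rough illustration of the kind of extract-transform-load step involved, here is a minimal sketch using only the standard library; the table name, column names, and sample data are hypothetical, and a production pipeline would typically run inside an orchestrator such as Airflow:

```python
import csv
import io
import sqlite3

# Hypothetical raw CSV export from an upstream system (the "extract" step).
RAW_CSV = """user_id,signup_date,country
1,2024-01-15,BR
2,2024-02-03,US
3,,DE
"""

def run_pipeline(db_path=":memory:"):
    """Parse raw rows, drop records missing a signup date (transform),
    and write the clean rows to a relational table (load)."""
    rows = list(csv.DictReader(io.StringIO(RAW_CSV)))
    clean = [r for r in rows if r["signup_date"]]  # basic data-quality filter

    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users "
        "(user_id INTEGER, signup_date TEXT, country TEXT)"
    )
    conn.executemany(
        "INSERT INTO users VALUES (:user_id, :signup_date, :country)", clean
    )
    conn.commit()
    return conn

conn = run_pipeline()
count = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
print(count)  # 2 rows survive the quality filter
```

In practice each of these stages would be a separate, retryable task, and the quality filter would be replaced by explicit validation rules and monitoring, per the responsibilities listed above.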

Benefits

  • Home office
  • Competitive compensation based on experience
  • Career plans that allow for extensive growth within the company
  • International Projects
  • Oowlish English Program (Technical and Conversational)
  • Oowlish Fitness with Total Pass
  • Games and Competitions


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: Not listed
  • Number of Employees: 11-50
