Data Engineer (ID 50062)

AgileEngine, Miami, FL (Hybrid)

About The Position

This Senior Data Engineer (Python) role is central to transforming large, diverse datasets into reliable insights that support research and strategic decisions across a global financial platform. You will help shape a unified data ecosystem, partnering with data scientists, researchers, and stakeholders to connect technology with real business impact. What makes this opportunity unique is the scale of the data, the use of modern cloud, AI, and data engineering practices, and a strong influence on how the platform evolves. It is a chance to grow technically while contributing to a mission-driven, collaborative environment.

Requirements

  • You must be authorized to work for ANY employer in the US, as employment visa sponsorship is not available
  • Bachelor’s degree in computer science/engineering or other technical field, or equivalent experience
  • 5+ years of strong, hands-on experience with Python
  • 5+ years of experience with data processing and analytics libraries such as Pandas, Polars, PySpark, and DuckDB (a brief sketch follows this list)
  • 2+ years of experience with Big Data technologies such as Spark and Snowflake
  • Expert-level knowledge of Airflow or similar pipeline orchestration tools
  • Deep understanding of Medallion Architecture, columnar file formats, and database technologies including SQL, NoSQL, and Lakehouse architectures
  • Proven ability to work with third-party APIs for complex data ingestion
  • Proficiency with cloud and data platforms such as AWS, GCP, and Snowflake, including advanced SQL optimization
  • Upper-intermediate English level
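
To give a concrete flavor of the tooling listed above, here is a minimal sketch of one aggregation run over a Parquet file twice: once with DuckDB's SQL engine and once with Polars' lazy API. The file name (transactions.parquet) and columns (account_id, amount, trade_date) are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch, not part of the original posting: querying a Parquet file with
# DuckDB SQL and with Polars. File and column names are hypothetical.
from datetime import date

import duckdb
import polars as pl

# DuckDB reads Parquet directly and only touches the columns the query needs.
con = duckdb.connect()
top_accounts = con.execute(
    """
    SELECT account_id, SUM(amount) AS total_amount
    FROM read_parquet('transactions.parquet')
    WHERE trade_date >= DATE '2024-01-01'
    GROUP BY account_id
    ORDER BY total_amount DESC
    LIMIT 10
    """
).df()

# The same aggregation with Polars' lazy API, which also scans Parquet lazily
# and prunes unneeded columns before reading.
top_accounts_pl = (
    pl.scan_parquet("transactions.parquet")
    .filter(pl.col("trade_date") >= date(2024, 1, 1))
    .group_by("account_id")
    .agg(pl.col("amount").sum().alias("total_amount"))
    .sort("total_amount", descending=True)
    .limit(10)
    .collect()
)
```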

Nice To Haves

  • Familiarity with the fintech industry and financial data domains
  • Documentation skills for data pipelines, architecture designs, and best practices
  • OpenSearch or Elasticsearch
  • AWS SageMaker Studio and Jupyter for data analysis
  • Terraform
  • Scala

Responsibilities

  • Design and build scalable Data Lakes, Data Warehouses, and Data Lakehouses
  • Design and implement robust ETL/ELT processes at scale using Python and pipeline orchestration tools like Airflow (see the sketch after this list)
  • Develop ingestion workflows from diverse third-party APIs and data sources
  • Manage and optimize file formats such as Parquet, Avro, and ORC for high-performance data retrieval
  • Work with AI development tools to support machine learning initiatives and advanced analytics
  • Act as a technical consultant to gather requirements, understand business goals, and translate them into technical roadmaps
  • Work with Terraform and other tools to provision AWS and on-prem infrastructure
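
As a rough illustration of the Airflow-plus-Parquet work described above, here is a minimal sketch of a daily ETL DAG using the TaskFlow API of a recent Airflow 2.x release. The endpoint URL, file paths, DAG id, and the assumption that the API returns a JSON array of records are all hypothetical, and writing Parquet assumes pyarrow or fastparquet is installed; this is not the team's actual pipeline.

```python
# Minimal sketch, not part of the original posting: a daily Airflow 2.x DAG that
# pulls records from a hypothetical API and lands them as Parquet.
from datetime import datetime

import pandas as pd
import requests
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_prices_etl():
    @task
    def extract() -> str:
        # Pull raw records from a (hypothetical) third-party API and land them as JSON.
        raw_path = "/tmp/raw_prices.json"
        resp = requests.get("https://api.example.com/v1/prices", timeout=30)
        resp.raise_for_status()
        with open(raw_path, "w") as f:
            f.write(resp.text)
        return raw_path

    @task
    def transform_and_load(raw_path: str) -> str:
        # Normalize the raw JSON (assumed to be an array of records) and write a
        # columnar Parquet file for downstream queries.
        df = pd.read_json(raw_path)
        df["ingested_at"] = pd.Timestamp.now(tz="UTC")
        out_path = "/tmp/prices.parquet"
        df.to_parquet(out_path, index=False)
        return out_path

    # Wire the tasks together; Airflow infers the dependency from the returned value.
    transform_and_load(extract())


daily_prices_etl()
```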

Benefits

  • Professional growth: Mentorship, TechTalks, and personalized growth roadmaps.
  • Competitive compensation: USD-based pay with education, fitness, and team activity budgets.
  • Exciting projects: Modern solutions with Fortune 500 and top product companies.
  • Flextime: Flexible schedule with remote and office options.