Data Engineer

Analytica S.A.S.
About The Position

Analytica is looking for an experienced ETL Data Engineer to design, build, optimize, and support scalable data pipelines using Informatica and AWS. You'll work closely with data architects, analysts, and business stakeholders to ensure reliable, secure, and high-quality data delivery across enterprise systems. This role is ideal for someone who enjoys solving data integration challenges, improving performance and reliability, and contributing to modern cloud-based data platforms. Analytica has been recognized by Inc. for three consecutive years as one of the 250 fastest-growing businesses. We offer competitive compensation with opportunities for bonuses, employer-paid health care, training and development funds, and a 401(k) match.

Requirements

  • 3+ years of hands-on ETL/data engineering experience
  • Strong experience with Informatica PowerCenter and/or Informatica Intelligent Cloud Services (IICS)
  • Experience building data solutions in AWS, including services such as S3, IAM, CloudWatch, Glue, Lambda, Step Functions, Redshift, RDS, or Aurora
  • Strong SQL skills (query optimization, joins, windowing, performance tuning)
  • Python scripting for automation or integration support
  • Experience with data warehouse concepts and dimensional modeling
  • Familiarity with SDLC practices and version control tools (Git, Bitbucket, etc.)
  • Ability to troubleshoot pipeline failures and performance bottlenecks independently
  • Strong communication skills and comfort working cross-functionally

Nice To Haves

  • Experience with Informatica Cloud connectors and Cloud Data Integration patterns
  • Exposure to Databricks or modern lakehouse platforms
  • CI/CD pipeline experience (Jenkins, GitHub Actions, Azure DevOps, etc.)
  • Experience with metadata management, lineage, and governance tools
  • Familiarity with Agile/Scrum methodologies

Responsibilities

  • Design, develop, and maintain ETL workflows using Informatica (PowerCenter and/or IICS)
  • Build and optimize cloud-based data pipelines and integrations on AWS
  • Develop and support ingestion, transformation, and loading processes from multiple source systems (SQL, APIs, flat files, etc.)
  • Implement and maintain data quality, validation, and reconciliation checks
  • Manage scheduling, dependencies, and monitoring of ETL jobs; troubleshoot failures and performance issues
  • Collaborate with data architects to implement target models in data lakes/warehouses
  • Ensure pipelines follow security, governance, and compliance standards
  • Contribute to best practices for CI/CD, version control, environment promotion, and deployment automation
  • Document ETL design, mappings, lineage, and operational processes
  • Participate in on-call or production support rotations as needed

Benefits

  • Competitive compensation with opportunities for bonuses
  • Employer-paid health care
  • Training and development funds
  • 401(k) match