Senior Engineer, Enterprise Architecture

Invesco Ltd. · Houston, TX
Hybrid

About The Position

Invesco is a leading independent global investment firm dedicated to rethinking possibilities for its clients. By delivering the combined power of its distinctive investment management capabilities, Invesco provides a wide range of investment strategies and vehicles to clients worldwide.

This role contributes to the development and maintenance of enterprise data platform infrastructure, building enterprise-grade data platform capabilities to support Client Experience, Regulatory, and Investment teams. The position involves implementing data entitlement policies, designing and managing ETL/ELT pipelines, enhancing data capabilities through cloud migrations, and managing messaging queues. The engineer will work closely with business stakeholders to design data platforms that integrate seamlessly into the broader ecosystem, ensuring operational resiliency.

The role also includes developing and maintaining data infrastructure on multiple cloud platforms, implementing robust security and access control, provisioning infrastructure using Infrastructure-as-Code (IaC), developing CI/CD pipelines, managing containerization, integrating security scanning tools, optimizing system performance, and developing data set patterns for discovery, modeling, mining, and archival. Ensuring business continuity through disaster recovery planning and high-availability setups, and maintaining data platform compliance with regulatory standards and internal governance policies, are also key responsibilities.

Requirements

  • Must have a Bachelor’s degree in Computer Science, Computer Engineering, Electrical/Electronic Engineering or related technical field
  • Must have 5 years of progressive experience in data engineering positions, performing/utilizing the following:
  • Expertise in Snowflake, Airflow, AWS, and modern data stack platforms.
  • Strong understanding of data governance, quality, and security principles.
  • Experience with microservices in cloud environments and core AWS services.
  • Containerization and orchestration
  • Building data ingestion pipelines from REST APIs, flat files, and cloud storage.
  • Proficiency in system monitoring, logging, and alerting tools.
  • Experience in high-availability and disaster recovery planning.
  • Hands-on experience with: Airflow, BizTalk, Snowflake, and IIS; IBM WebSphere MQ (WMQ) and Amazon MQ; PostgreSQL, SQL Server, and Oracle; AWS cloud services including EC2, Lambda, EKS, RDS, CloudWatch, Certificate Manager, KMS, SNS, Glue, Athena, S3, Route53, ALB, and IAM; Terraform, AWS CloudFormation, and Git; Azure DevOps, Bitbucket, and JFrog Artifactory; Fortify, Prisma, WhiteSource, and SonarQube; Python, PowerShell, and MSBuild; JIRA, Confluence, and ServiceNow; Power BI and Tableau; SQL and dbt; and orchestration tools including Airflow and Airbyte.

Responsibilities

  • Contribute to the development and maintenance of enterprise data platform infrastructure, building out enterprise-grade data platform capabilities that support our Client Experience, Regulatory, and Investment teams.
  • Implement data entitlement policies using tagging, masking, and attribute-based controls.
  • Design and manage ETL/ELT pipelines triggered by schedules, upstream events, or business needs.
  • Enhance data capabilities by architecting, designing, and delivering data migrations to cloud solutions such as Snowflake and Aurora.
  • Work closely with the data modeler and architecture group to establish tools and practices for enterprise data modeling needs.
  • Manage and maintain messaging queue infrastructure, including the creation of queues, topics, subscriptions, channels, and remote queues, and design secure message flows with encryption, authorization, and authentication.
  • Work closely with business stakeholders to identify required business functionality and design data platforms that integrate seamlessly into the broader ecosystem, supporting business, functional, and performance needs while ensuring operational resiliency.
  • Develop, maintain, and support data infrastructure on multiple cloud platforms.
  • Design and implement robust security and access control mechanisms across all data platforms.
  • Provision infrastructure and resources using Infrastructure-as-Code (IaC).
  • Develop, implement and maintain CI/CD pipelines to automate software build, test, and deployment processes across multiple environments.
  • Develop and manage container images and use container orchestration to deploy them, including scaling, rolling updates, and self-healing.
  • Integrate security scanning tools into development workflows.
  • Optimize system performance and reliability through proactive monitoring.
  • Develop data set patterns and processes for data discovery, modeling, mining, and archival.
  • Ensure business continuity through disaster recovery planning and high-availability setups.
  • Maintain data platform compliance with regulatory standards and internal governance policies, including data privacy, access control, and audit readiness.

Benefits

  • Flexible paid time off
  • Hybrid work schedule
  • 401(k) matching of 100% up to the first 6%, with a discretionary supplemental contribution
  • Health & wellbeing benefits
  • Parental Leave benefits
  • Employee stock purchase plan