Senior Cloud Data Architect

Boeing
Long Beach, CA
Onsite

About The Position

Boeing has a current need for a Senior Cloud Data Architect to deliver enterprise data pipelines and platform components on AWS and Databricks. This is a hands‑on contributor role that drives implementation, performance tuning, and mentoring, and leads the modernization of legacy ETL pipelines to a scalable, configuration-driven ETL framework running on AWS.

Requirements

  • Bachelor’s Degree or higher in Computer Science, Engineering, Information Systems, or equivalent practical experience
  • Demonstrated ability to lead technical initiatives, mentor peers, and communicate effectively across distributed teams.
  • 5+ years' experience with ETL tools and patterns (e.g., DataStage, Informatica) and building repeatable ETL/ELT pipelines
  • 5+ years' hands‑on experience building large‑scale big data applications with Databricks / Apache Spark, including demonstrable production performance tuning; familiarity with Hadoop and Kafka is a plus
  • 3+ years of experience in designing and implementing metadata-driven, pattern-based ETL/ELT frameworks.
  • 3+ years working with AWS data services and core managed services (S3, VPC, IAM, KMS, Secrets Manager, EC2) and cloud data lake/warehouse concepts.
  • 3+ years' experience implementing CI/CD and DevOps practices for data workloads (GitHub/GitLab, Terraform, Jenkins, or equivalent)
  • 3+ years' experience with orchestration tools (Airflow, AutoSys, Databricks Workflows)
  • Hands‑on experience with ingestion patterns: batch, streaming, and CDC
  • Strong skills in performance tuning and optimization of new and migrated data pipelines
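To make the "metadata-driven, pattern-based ETL/ELT framework" requirement above concrete, here is a minimal, hypothetical sketch of the pattern: a generic engine applies whatever transform steps a pipeline's configuration names, so new pipelines are defined by metadata rather than new code. All names (`TRANSFORMS`, `run_pipeline`, `pipeline_config`) are illustrative, not part of any Boeing framework.

```python
# Hypothetical sketch of a configuration-driven ETL step.
# Pipeline behavior is driven entirely by metadata (the config),
# while the transforms themselves are reusable, registered functions.

from typing import Any, Callable

# Registry of reusable transforms, keyed by name so configs can reference them.
TRANSFORMS: dict[str, Callable[[list[dict[str, Any]]], list[dict[str, Any]]]] = {
    "drop_nulls": lambda rows: [
        r for r in rows if all(v is not None for v in r.values())
    ],
    "uppercase_keys": lambda rows: [
        {k.upper(): v for k, v in r.items()} for r in rows
    ],
}

def run_pipeline(rows: list[dict[str, Any]], config: dict[str, Any]) -> list[dict[str, Any]]:
    """Apply the transforms listed in the config, in order."""
    for step in config["steps"]:
        rows = TRANSFORMS[step](rows)
    return rows

# The same engine runs any pipeline described by metadata.
pipeline_config = {"steps": ["drop_nulls", "uppercase_keys"]}
source = [{"id": 1, "name": "a"}, {"id": 2, "name": None}]
print(run_pipeline(source, pipeline_config))  # [{'ID': 1, 'NAME': 'a'}]
```

In a production framework the registry would hold Spark transformations and the config would live in a metadata store, but the separation of engine, transforms, and configuration is the same.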

Nice To Haves

  • 5+ years' exposure to data security, governance, and compliance practices (encryption, RBAC, metadata management); familiarity with FedRAMP, NIST, and GDPR
  • Experience migrating medium‑to‑large pipelines to cloud — include scale if possible (e.g., TBs/day, number of pipelines).
  • Familiarity with observability and lineage tooling (Datadog, Prometheus, OpenLineage, Unity Catalog, etc.).
  • Experience with Agile software development lifecycle and tooling (ADO, JIRA)

Responsibilities

  • Lead a large-scale ETL modernization initiative migrating legacy pipelines (e.g., DataStage, GoldenGate, HVR) to a scalable, configuration-driven, metadata-based ETL framework, and ensure adherence to data governance, security, and compliance standards.
  • Lead the implementation of a metadata‑driven, reusable ETL framework on the AWS cloud data platform, and champion repeatable, self‑service cloud and data architecture patterns that enable teams to autonomously deploy scalable, high‑performance, maintainable, and compliant data pipelines across the enterprise.
  • Lead end-to-end data integration and ETL/ELT processes to ingest, transform, and deliver complex structured and unstructured data into a governed Data Lakehouse, enabling seamless access for analytics, reporting, and data science workloads.
  • Design and deliver cloud-native, cloud-agnostic data platforms and data engineering solutions on AWS, using SaaS products such as Databricks, to ensure portability, resilience, and consistent governance across environments.
  • Drive automation, DevOps/DevSecOps, and Infrastructure as Code (IaC) initiatives to deliver repeatable, testable, and deployable artifacts and accelerate migrations.
  • Troubleshoot and resolve implementation issues throughout the SDLC; monitor architecture compliance and operational health.
  • Design and configure data pipelines with enterprise orchestration and scheduling tools, and establish monitoring, alerting, and operational runbooks for production support.
  • Provide technical leadership, mentorship, and guidance to ETL engineering teams: establish coding best practices, enable the team with automation strategies and tools, conduct peer reviews, and drive knowledge sharing across distributed teams.
  • Build and maintain strong relationships with vendors, partners, and cross‑functional teams; own stakeholder communications and collaboration channels; and drive accountability and organizational change through regular updates to product managers, DBAs, architects, and senior leadership.
  • Operationalize and standardize cloud platforms (AWS/Azure), applying architecture patterns, guardrails, and enterprise standards for scalability, reliability, security, compliance, and cost control.

Benefits

  • Health insurance
  • Flexible spending accounts
  • Health savings accounts
  • Retirement savings plans
  • Life and disability insurance programs
  • Paid and unpaid time away from work