AWS Data Engineer

Pivotal Consulting | Seattle, WA
$50 - $70 per hour | Hybrid

About The Position

AWS Data Engineer, Seattle, WA (Hourly W2; Hybrid). Applicants must be currently authorized to work in the United States on a full-time basis. The employer will not sponsor applicants for work visas and may not have resources available to support STEM OPT training requirements. No C2C: Pivotal does not accept unsolicited applications or resumes from third-party recruiters/agencies.

Why clients choose Pivotal Consulting: We are a technology management consulting firm helping Fortune 500 companies improve their performance – we specialize in making People, Process, and Technology work together! Our clients count on us to deliver excellence and seek our guidance on business and technology strategy, technology modernization, and cloud transformation initiatives. Simply put, by listening to our clients closely and focusing on delivering quality, we bring them peace of mind. After guiding numerous clients, from global enterprises to mid-market firms to non-profit organizations, we are now experiencing breakthrough growth!

What we are looking for: Pivotal Consulting is seeking an experienced AWS Data Engineer to support a major government data modernization initiative. The ideal candidate will design, build, and optimize scalable data pipelines and cloud-based data solutions using AWS-native services. This role requires close collaboration with analysts, developers, and business stakeholders to ensure reliable, secure, and high-quality data delivery across multiple systems.

Requirements

  • Bachelor’s degree in Computer Science, Information Systems, or related field (or equivalent experience).
  • 5+ years of experience in data engineering with cloud-based architectures.
  • 3+ years of hands-on experience with AWS services for data integration, storage, and analytics.
  • Strong programming proficiency in Python and SQL.
  • Experience with data modeling and relational database concepts.
  • Familiarity with ETL orchestration tools and serverless data workflows.
  • Experience implementing secure, compliant solutions in federal or regulated environments.

Nice To Haves

  • AWS Certified Data Analytics – Specialty or AWS Certified Solutions Architect certification.
  • Prior experience supporting federal data initiatives or DoD/Intel contracts.
  • Exposure to tools like Apache Spark, Airflow, or Snowflake.
  • Knowledge of DevOps practices and CI/CD automation in a cloud environment.

Responsibilities

  • Design, develop, and maintain data pipelines using AWS Glue, Lambda, Step Functions, and S3.
  • Implement and manage large-scale data lake and data warehouse environments (e.g., Redshift, Athena).
  • Leverage ETL/ELT frameworks for ingesting and transforming structured and unstructured data.
  • Develop and maintain infrastructure-as-code (IaC) using Terraform or AWS CloudFormation.
  • Collaborate with data analysts, BI developers, and cloud architects to optimize performance and cost efficiency.
  • Ensure data security, compliance, and governance across all AWS environments in accordance with government standards.
  • Troubleshoot and optimize data workflows for performance, scalability, and cost.
  • Document architecture, processes, and code for maintainability.

Benefits

  • Pivotal Consulting offers a comprehensive benefits package, including medical, dental, and vision insurance, and a 401(k).