Photon Career Site · Posted 3 months ago
$36,000 - $126,000/Yr
Full-time
5,001-10,000 employees

This job description is for a Full Stack Engineer working on a cloud-native platform modernization project. The successful candidate will design, develop, and implement a scalable, resilient, and secure AWS-based system to replace a legacy platform.

Responsibilities:
  • Build a scalable, resilient, and secure AWS-based data ingestion platform.
  • Develop reusable data ingestion pipelines to support current and future data sources.
  • Implement both real-time and batch data processing capabilities.
  • Implement a mechanism to determine whether a security record needs to be created or updated.
  • Develop a vendor data transitional load mechanism to temporarily store incoming vendor data.
  • Implement a purge mechanism to remove data from the transitional load table once the corresponding security has been created or updated (see the sketch after this list).
  • Enhance the Mainframe DB2 data flow to consume data from newly added files.
  • Provision AWS infrastructure using Infrastructure as Code (IaC) tools and ensure it meets security and compliance standards.
  • Implement robust security controls, including IAM, encryption, audit logging, and compliance configurations.
  • Set up observability tooling for centralized logging, monitoring, and alerting.
  • Build CI/CD pipelines for automated deployment and rollback.
  • Develop and execute a phased migration plan to minimize disruption.
  • Build and validate a reconciliation system to ensure data integrity between the legacy platform and the new system.
  • Decommission legacy files after a successful 60-day parallel processing period.
  • Collaborate with internal stakeholders to define the target AWS architecture and ensure alignment with enterprise security and data governance standards.
  • Develop operational documentation, including architecture diagrams, runbooks, and SOPs.
  • Conduct DR drills and support user acceptance testing (UAT).
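
The purge step called out above is left open-ended in the posting; purely as a hedged illustration, the following Python sketch assumes the transitional load data lands in a DynamoDB table named vendor_transitional_load with a vendor_record_id key and a security_status attribute (all hypothetical names, not specified in the role description).

    # Hypothetical purge sketch: delete transitional-load rows whose security
    # has already been created or updated. Table, key, and attribute names are
    # assumptions for illustration only.
    import boto3
    from boto3.dynamodb.conditions import Attr

    dynamodb = boto3.resource("dynamodb")
    transitional = dynamodb.Table("vendor_transitional_load")

    def purge_processed_rows() -> int:
        """Remove rows once the corresponding security has been created/updated."""
        purged = 0
        scan_kwargs = {
            "FilterExpression": Attr("security_status").is_in(["CREATED", "UPDATED"]),
            "ProjectionExpression": "vendor_record_id",
        }
        while True:
            page = transitional.scan(**scan_kwargs)
            with transitional.batch_writer() as batch:
                for item in page["Items"]:
                    batch.delete_item(Key={"vendor_record_id": item["vendor_record_id"]})
                    purged += 1
            if "LastEvaluatedKey" not in page:
                break
            scan_kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]
        return purged
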
Required skills and experience:
  • Extensive experience with AWS services, including provisioning infrastructure, setting up security controls, and working with cloud-native solutions.
  • Proven experience building scalable data ingestion pipelines and implementing both real-time and batch data processing capabilities.
  • Strong skills in developing back-end solutions, particularly for data processing, data storage (transitional load), and interfacing with legacy systems like Mainframe DB2.
  • Hands-on experience with CI/CD pipelines and Infrastructure as Code (IaC) tools (a minimal IaC sketch follows this list).
  • Knowledge of implementing security controls like IAM, encryption, and audit logging.
  • Experience with observability tools for logging, monitoring, and alerting.
  • Ability to analyze requirements and data mappings for ingestion, define technical tasks, and create agile backlog items (epics, stories).
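
As a hedged illustration of the IaC expectation mentioned above (not part of the original posting), the sketch below uses AWS CDK v2 for Python, one common IaC option, to provision an encrypted, SSL-only, non-public S3 landing bucket; the stack and bucket names are placeholders.

    # Hypothetical IaC sketch (AWS CDK v2, Python): an encrypted, non-public
    # S3 landing bucket for the ingestion platform. Resource names are placeholders.
    import aws_cdk as cdk
    from aws_cdk import Stack, aws_s3 as s3
    from constructs import Construct

    class IngestionStack(Stack):
        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)
            s3.Bucket(
                self,
                "VendorLandingBucket",
                encryption=s3.BucketEncryption.S3_MANAGED,    # server-side encryption
                block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
                enforce_ssl=True,                             # reject non-TLS requests
                versioned=True,
            )

    app = cdk.App()
    IngestionStack(app, "IngestionStack")
    app.synth()
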
Benefits:
  • Medical, vision, and dental benefits
  • 401k retirement plan
  • Variable pay/incentives
  • Paid time off
  • Paid holidays