Dataflow Systems Engineer

Parsons Corporation • Annapolis Junction, MD

About The Position

Parsons is looking for someone with dataflow engineering experience to join our team. In this role, you will support key activities in defense against emerging threats in the cybersecurity domain by designing, implementing, and managing secure cloud solutions that optimize performance, security, and compliance. You will work as part of a team that develops and tailors capabilities with the goal of preventing and eradicating threats to critical U.S. systems.

Requirements

  • Must have a Bachelor of Science in Computer Science, Cybersecurity, Information Systems, or related field.
  • 10+ years of experience in a technical role such as Cloud Engineer, Software Engineer, Systems Engineer, DevOps Engineer, or Application Developer.
  • Experience deploying and configuring enterprise services on classified networks
  • Experience installing, configuring, deploying and maintaining enterprise services
  • Experience running CI/CD life cycles across multiple environments
  • Experience with scripting languages (e.g., Bash, Python) for automating tasks.
  • Understanding of Linux environments
  • U.S. Citizenship
  • Active TS/SCI security clearance with polygraph

Nice To Haves

  • Strong ability to grasp new technologies and acquire new skills through independent study, professional training, and interaction with other team members
  • Ability to communicate with technical team members, managers, and customers
  • Ability to automate workflows and processes using scripts and tools

Responsibilities

  • Design and implement ETL pipelines using tools such as Python, Apache NiFi, and Kafka to move high-volume data efficiently.
  • Participate in technical and architectural efforts to ensure data is properly transported between different data stores and systems, and make recommendations to optimize infrastructure costs associated with data storage.
  • Translate data across multiple fabrics in an environment with varying levels of classification, ensuring data is structured in formats suitable for analytic consumption.
  • Manage data pipelines to optimize cloud performance and implement best practices to ensure data flow meets governance, security, and compliance standards.
  • Support program needs for artifacts and documentation of dataflows and infrastructure diagrams.
  • Enhance CI/CD pipelines and DevOps workflows using available tools.
  • Work as part of an extended team to define and design systems to meet information protection needs and continually assess systems effectiveness in defense of emerging threats.
  • Collaborate in agile scrum cycles, contributing to design sessions, conducting peer reviews, and providing support for testing and documentation.

Benefits

  • medical
  • dental
  • vision
  • paid time off
  • 401(k)
  • life insurance
  • flexible work schedules
  • holidays