Data Engineer

Parsons | Reston, VA

About The Position

Parsons is seeking a highly talented Data Engineer to join their team. This role involves supporting the design, implementation, and sustainment of the data plane for a large-scale distributed system, encompassing microservices, event processing, search/analytics, enterprise reporting, and operational telemetry. The engineer will work across both cloud-native and constrained edge-style deployments, ensuring data is reliably collected, transformed, moved, searched, and visualized even in challenging operating environments.

This position is focused on data engineering across distributed microservices, event pipelines, search/analytics platforms, enterprise data integration, and resilient/manual data movement workflows, rather than traditional database administration. The role is part of the Federal Solutions team, which delivers resources to US government customers for global missions in defense, security, intelligence, infrastructure, and environmental sectors.

Requirements

  • Bachelor’s degree in Computer Science, Software Engineering, Data Engineering, Information Systems, or related technical field (4 additional years of experience can substitute for a degree).
  • 10+ years of software and/or data engineering experience.
  • Strong experience in data engineering for distributed applications and microservices-based systems.
  • Hands-on experience with Java-based service ecosystems.
  • Hands-on experience with MongoDB.
  • Hands-on experience with Elasticsearch / ELK.
  • Hands-on experience with RabbitMQ or similar message/event platforms.
  • Good understanding of distributed systems concepts including eventual consistency, retries and idempotency, partition tolerance and degraded/offline operation, and data reconciliation and replay.
  • Experience developing and supporting data pipelines across both operational systems and analytics/reporting environments.
  • Experience designing robust data movement mechanisms, including fallback/manual transfer workflows for disrupted environments.
  • Experience with Power BI and/or Power Automate integration pipelines, including data shaping and feed preparation for dashboards and workflow automation.
  • Experience working with telemetry/log/metric pipelines and operational dashboarding concepts.
  • Ability to write clean technical documentation for schemas, data flows, and operating procedures.
  • Strong oral and written communication skills and the ability to work cooperatively and effectively as a team member.
  • An active Top Secret/SCI security clearance is required for this position.
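The distributed-systems concepts named above (retries, idempotency, replay) can be illustrated with a minimal sketch. This is not code from the program; the class and method names are hypothetical, and it simply shows why an event consumer that deduplicates by event ID stays correct under at-least-once redelivery:

```java
import java.util.HashSet;
import java.util.Set;

// Minimal sketch of an idempotent event consumer. Under at-least-once
// delivery (e.g., after a retry or a broker redelivery), the same event
// can arrive more than once; tracking processed IDs makes replay safe.
public class IdempotentConsumer {
    private final Set<String> processedIds = new HashSet<>();
    private int applied = 0;

    // Returns true if the event was applied, false if it was a duplicate.
    public boolean handle(String eventId) {
        if (!processedIds.add(eventId)) {
            return false; // duplicate delivery: already applied, ignore
        }
        applied++; // in a real service: perform the side effect here
        return true;
    }

    public int appliedCount() {
        return applied;
    }

    public static void main(String[] args) {
        IdempotentConsumer c = new IdempotentConsumer();
        // Simulated redeliveries: evt-1 and evt-2 each arrive twice.
        String[] deliveries = {"evt-1", "evt-2", "evt-1", "evt-3", "evt-2"};
        for (String id : deliveries) {
            c.handle(id);
        }
        System.out.println(c.appliedCount()); // each distinct event applied once
    }
}
```

In production the "seen IDs" set would live in durable storage (or be replaced by an idempotent write such as an upsert keyed on the event ID), but the invariant is the same: reprocessing a delivery must not change the outcome.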

Nice To Haves

  • Master’s degree in Data Engineering, Computer Science, Analytics, or related field.
  • Experience with service platform tooling such as Consul, Nomad, and Vault.
  • Familiarity with legacy-to-modern data transition patterns, including relational to document-oriented migration approaches.
  • Experience supporting systems deployed in constrained environments with intermittent connectivity and variable infrastructure quality.
  • Experience with PostgreSQL and Grails/Java legacy systems in migration or sustainment contexts.
  • Experience with search engineering, semantic search, or cognitive search implementations using Elasticsearch.
  • Experience building data feeds that support operational dashboards, executive reporting, and data science workflows.
  • Familiarity with observability tooling transitions and telemetry normalization across multiple sources.
  • Ability to identify data architecture improvements, develop prototypes, and help build the case for operational enhancements.
  • Experience training users or technical teams as new data capabilities move into production.
  • Demonstrated success working across software, platform, analytics, and operations teams in complex technical environments.
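The relational-to-document migration pattern mentioned above can be sketched briefly. This is an assumption-laden illustration, not the project's actual schema: rows from a hypothetical parent/child join are folded into one nested document per parent key, the shape a document store like MongoDB would hold:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of relational-to-document mapping: flattened join rows
// (assetId, assetName, partId) are folded into one document per asset
// with an embedded list of parts. Field names are hypothetical.
public class RowFolder {
    public static Map<String, Map<String, Object>> fold(List<String[]> rows) {
        Map<String, Map<String, Object>> docs = new LinkedHashMap<>();
        for (String[] row : rows) {
            Map<String, Object> doc = docs.computeIfAbsent(row[0], id -> {
                Map<String, Object> d = new LinkedHashMap<>();
                d.put("name", row[1]);
                d.put("parts", new ArrayList<String>());
                return d;
            });
            @SuppressWarnings("unchecked")
            List<String> parts = (List<String>) doc.get("parts");
            parts.add(row[2]); // child rows become embedded array entries
        }
        return docs;
    }

    public static void main(String[] args) {
        List<String[]> rows = Arrays.asList(
            new String[]{"A1", "router", "P1"},
            new String[]{"A1", "router", "P2"},
            new String[]{"A2", "switch", "P3"});
        System.out.println(fold(rows).size()); // one document per asset
    }
}
```

The design trade-off is the classic embedding-versus-referencing decision: embedding children denormalizes the join away, at the cost of duplicating parent fields during the fold.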

Responsibilities

  • Designing, developing, and maintaining the data plane for a distributed microservices ecosystem consisting of approximately 25 core Java microservices and 25 ancillary integration microservices connecting to external systems.
  • Supporting data flows across a modern service platform built around MongoDB (operational data storage), Elasticsearch / ELK (analytics, search, cognitive/semantic search use cases, and telemetry analysis), and RabbitMQ (event-driven processing and message distribution), along with service orchestration and security components such as Consul, Nomad, and Vault.
  • Engineering data solutions that can operate across a wide range of deployment models, from very small, constrained environments (e.g., a few bare-metal small-form-factor servers) to larger hybrid/cloud-native environments that scale to hundreds of nodes.
  • Building and optimizing data pipelines that support operational application data flows, telemetry ingestion (hot/warm/cold paths), dashboard/reporting data feeds, and enterprise data integration across inventory, asset, and financial systems.
  • Developing resilient mechanisms for manual or semi-manual data extraction, transfer, and re-ingestion when automated connectivity is degraded or unavailable due to technical, physical, or geopolitical disruptions.
  • Supporting data migration activities between legacy (Java/Grails using PostgreSQL) and modernized (MongoDB) platforms, working within existing migration processes and improving tooling, reliability, and observability where appropriate.
  • Building and maintaining integrations that feed Power BI dashboards, Power Automate workflows, and downstream data science/analytics use cases.
  • Enabling operational visibility through telemetry engineering: ingesting and transforming logs, events, and metrics, supporting observability pipelines based on ELK and current/transitioning monitoring tooling, and helping shape future-state consolidation toward a more ELK-centric observability model.
  • Collaborating closely with software engineers, platform engineers, SRE/operations, data scientists, and enterprise stakeholders to operationalize data and deliver reliable insights.
  • Documenting data models, interfaces, schemas, transformations, operational procedures, and recovery workflows.
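The fallback/manual data movement responsibility above can be sketched in miniature. This is a hedged illustration under stated assumptions (a line-oriented bundle format and the class name `TransferBundle` are invented here, not project specifics): records destined for an offline transfer are packed with per-record checksums, then verified and deduplicated on re-ingest so that replaying the same bundle is harmless:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Sketch of a manual transfer bundle: records are serialized with a
// SHA-256 checksum so they can be carried offline (e.g., on removable
// media) and verified and idempotently replayed on re-ingest.
public class TransferBundle {
    // Pack records as "sha256hex<TAB>payload" lines.
    public static List<String> pack(List<String> records) {
        List<String> lines = new ArrayList<>();
        for (String r : records) {
            lines.add(sha256(r) + "\t" + r);
        }
        return lines;
    }

    // Verify checksums and drop duplicates on re-ingest; `seen` carries
    // already-applied digests across replays of the same bundle.
    public static List<String> ingest(List<String> lines, Set<String> seen) {
        List<String> accepted = new ArrayList<>();
        for (String line : lines) {
            int tab = line.indexOf('\t');
            String digest = line.substring(0, tab);
            String payload = line.substring(tab + 1);
            if (!digest.equals(sha256(payload))) {
                continue; // corrupted in transit: skip for re-request
            }
            if (seen.add(digest)) {
                accepted.add(payload); // first time seen: apply
            }
        }
        return accepted;
    }

    private static String sha256(String s) {
        try {
            byte[] d = MessageDigest.getInstance("SHA-256")
                    .digest(s.getBytes(StandardCharsets.UTF_8));
            StringBuilder sb = new StringBuilder();
            for (byte b : d) sb.append(String.format("%02x", b));
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        Set<String> seen = new HashSet<>();
        List<String> lines = pack(Arrays.asList("rec-1", "rec-2"));
        // Re-ingesting the same bundle twice applies each record only once.
        System.out.println(ingest(lines, seen).size() + ","
                + ingest(lines, seen).size()); // 2,0
    }
}
```

A real workflow would add sequencing and a manifest so gaps can be detected and re-requested, but checksum-then-dedup is the core that makes disrupted, out-of-band transfer safe to replay.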

Benefits

  • Medical
  • Dental
  • Vision
  • Paid time off
  • Employee Stock Ownership Plan (ESOP)
  • 401(k)
  • Life insurance
  • Flexible work schedules
  • Holidays