Senior Data Engineer

Volume Integration · Reston, VA

About The Position

Turn complex data into powerful pipelines that drive smarter decisions. NS2 Mission is hiring a Senior Data Engineer to join our growing team in Reston, VA. This role is ideal for an experienced engineer who enjoys building scalable, high-performance data solutions, working alongside analysts and software engineers, and contributing to mission-impacting systems. You will play a key role in designing, building, and maintaining robust data pipelines and backend services that enable efficient data ingestion, processing, and accessibility at scale. Through strong Java-based engineering and collaborative problem solving, your work will support analytics, reporting, and advanced technologies.

Requirements

  • Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field (or equivalent experience)
  • 6–10 years of professional experience in data engineering or backend engineering roles
  • Strong hands-on experience with Java, including Spring Boot and building RESTful APIs
  • Experience designing and maintaining ETL/ELT pipelines and data workflows
  • Proficiency in SQL and experience working with relational and/or NoSQL databases
  • Hands-on experience modeling large datasets for analytics and operational use cases
  • Experience working in cloud environments (AWS, Azure, or GCP)
  • Understanding of data governance, security, and access control best practices
  • Strong problem-solving skills and the ability to communicate technical concepts clearly

Nice To Haves

  • Experience with Python for data processing or pipeline development
  • Experience with Elasticsearch or OpenSearch
  • Experience with big data technologies (e.g., Spark, Hadoop, Kafka)
  • Familiarity with containerization and orchestration tools (Docker, Kubernetes, etc.)
  • Exposure to CI/CD pipelines and DevOps practices
  • Experience supporting analytics, search, or machine learning workloads
  • Knowledge of integrating data platforms with visualization or reporting tools

Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL/ELT processes
  • Build and support Java-based data services using Spring Boot and RESTful APIs
  • Integrate data from multiple structured and unstructured data sources
  • Model, store, and optimize large datasets for performance, scalability, and reliability
  • Ensure data quality, integrity, and adherence to governance and security best practices
  • Collaborate with data scientists, analysts, and software engineers to support analytics and mission initiatives
  • Monitor, troubleshoot, and optimize pipeline and service performance
  • Document data architectures, flows, and operational processes
  • Support deployment, operations, and maintenance of cloud-based data solutions