Data Engineering Specialist

Leidos
Onsite

About The Position

The Leidos Digital Modernization Group is seeking a Data Engineering Specialist to join the Global Management System (GMS) Team, supporting the Global Solutions Management – Operations II (GSM-O II) contract. This contract covers the operations, sustainment, maintenance, repair, and defense of the Defense Information System Network (DISN) within the DoD Information Network (DODIN) for the Defense Information Systems Agency (DISA), including mission transformation and partner support.

The role requires the candidate to be within commuting distance of Scott AFB or Ft. Meade and to hold a Secret clearance and a Security+ certification (or equivalent DoD 8570 IAT II certification) upon employment start.

The specialist will contribute to data engineering activities, focusing on integrating and enriching DISN network topology data for advanced correlation and analytics. Key tasks include assisting in the design and implementation of data enrichment pipelines, integrating diverse data sources into Confluent (Kafka) and Elastic platforms, and maintaining Kafka and Elastic clusters for mission-critical operations. The position also involves platform sustainment, addressing operational challenges, and supporting automation of the software development lifecycle, including CI/CD, containerization, and automated testing, in line with DevOps best practices. The specialist will actively participate in Agile scrum teams, fostering collaboration and knowledge sharing, and will develop and maintain technical documentation to ensure compliance with DoD security standards.

Requirements

  • Bachelor’s degree from an accredited college in a related discipline, or equivalent experience/combined education, with 4–8 years of professional experience; or 2–6 years of professional experience with a related master’s degree.
  • 4+ years of experience in software engineering, data engineering, or business/data analysis, preferably within Agile/Scrum teams.
  • Hands-on software development experience with Python, Java, SQL, and working knowledge of JavaScript and HTML.
  • Experience with distributed version control using Git and hosted repository platforms such as Bitbucket.
  • Proficiency with data analytics and visualization tools, such as Power BI, Tableau, and the ELK stack (Elasticsearch, Logstash, Kibana).
  • Experience designing, developing, and optimizing ETL processes and data pipelines, including integration with event streaming platforms like Kafka.
  • Background in data modeling, unification, and analytics to support data-driven projects.
  • Experience implementing application and system integrations, including Kafka and Elastic platform integrations.
  • Understanding of networking and internet protocols, with experience supporting network-centric or data-driven environments.
  • Experience developing and deploying software on UNIX/Linux command line platforms.
  • Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.
  • Experience with Agile project management and collaboration tools such as JIRA and Confluence.
  • Active Secret DoD Security clearance prior to start date.
  • Active Security+ Certification (or other applicable DoD 8570 IAT II certification) prior to start date.

Nice To Haves

  • Experience with CI/CD techniques, containerized pipelines, and DevOps practices, including automating software delivery processes.
  • Familiarity with artificial intelligence and machine learning concepts, and interest in supporting the integration of AI/ML capabilities into data platforms.
  • Experience with data integration, storage, and analysis technologies such as Kafka, Elastic, Spark, or NiFi.
  • Hands-on experience with Kafka connector integrations and working knowledge of ksqlDB and Kafka Streams for real-time data processing.
  • Ability to develop software designs for streaming data applications, particularly using Kafka Streams or ksqlDB.
  • Experience developing and optimizing Kafka system integrations between Elasticsearch/Logstash and other systems.
  • Experience designing and implementing application deployment pipelines and developing software in containerized environments using Kubernetes and Docker.
  • Familiarity with Kubernetes deployment, Agile methodologies, and collaborative development tools.
  • Experience developing and deploying software in AWS cloud environments, including basic configuration of cloud infrastructure, networking, and security policies (GovCloud experience a plus).
  • Experience with full software lifecycle automation (design, development, testing, deployment), including production deployments.
  • Experience designing and building automated software testing pipelines using tools such as Ansible, Selenium, JMeter, JUnit, or similar.
  • Experience developing and deploying software in DoD environments (DISA experience a plus), including building applications that meet DoD security standards and implementing security guidelines (e.g., STIGs).
  • Ability to support the development of DoD requirements, traceability matrices, project plans/schedules, and contribute to software systems engineering documents and interface documents (IDDs/ICDs).
  • Experience with Agile methodologies and Atlassian tools, including JIRA and Confluence, for project tracking and collaboration.
  • Ability to work effectively in remote, geographically dispersed teams, demonstrating strong communication and collaboration skills.

Responsibilities

  • Contribute to data engineering efforts by supporting the integration and enrichment of DISN network topology data for advanced data correlation and analytics.
  • Participate in technical discussions with internal and external stakeholders to support solution design and implementation.
  • Develop, test, and deploy data pipelines and integration solutions across distributed systems and cloud environments, using Python, JavaScript, Java, and SQL.
  • Assist in requirements gathering and collaborate with stakeholders to design and implement data enrichment pipelines, integrating diverse data sources into Confluent (Kafka) and Elastic platforms.
  • Develop and maintain Kibana visualizations and dashboards to support operational insights.
  • Support Kafka system integrations between Elasticsearch/Logstash and other systems.
  • Collaborate within Agile scrum teams, contribute to team deliverables, and share knowledge with peers.
  • Communicate and coordinate effectively with geographically distributed team members to achieve project objectives.
  • Troubleshoot and help resolve installation, infrastructure, and system issues; report and help mitigate technical risks.
  • Develop and maintain technical documentation, including DoD requirements, interface documents, and security compliance artifacts.
  • Ensure solutions comply with DoD security standards and guidelines, and support platform sustainment and reliability by addressing operational challenges as needed.

Benefits

  • Competitive compensation
  • Health and Wellness programs
  • Income Protection
  • Paid Leave
  • Retirement