Big Data Software Engineer (Java & Cloud Focus) - Fully Cleared

Intelliforce-IT Solutions Group, Columbia, MD
$176,000 - $226,000 | Onsite

About The Position

At Intelliforce, we build the technologies that enable the mission to move faster, smarter, and with confidence. As a Software Engineer on this team, you will work at the intersection of large-scale data processing and cloud infrastructure, developing systems that turn massive datasets into actionable intelligence. If you enjoy solving hard problems, writing efficient code, and seeing your work directly support national security operations, this role offers meaningful impact and technical depth. You will collaborate with experienced engineers, data specialists, and mission partners to design and deliver scalable solutions that operate in demanding environments.

Your day will center on building and enhancing high-performance software that processes and analyzes large volumes of data. You might start by refining a Java service that powers a distributed analytics pipeline, then shift to optimizing MapReduce jobs to improve performance across a Hadoop cluster. Throughout the day, you will collaborate with teammates to troubleshoot data processing issues, integrate cloud-based capabilities, and ensure systems remain reliable and scalable. You will also contribute to new feature development, system improvements, and integration of mission-specific technologies such as GhostMachine and QTA, helping teams extract value from complex datasets quickly and securely.

Requirements

  • Active Top Secret Clearance with Full Scope Polygraph (required)
  • Must be a U.S. Citizen
  • Bachelor’s degree in Computer Science or a related discipline from an accredited college or university
  • Four (4) additional years of relevant Software Engineering experience may be substituted for a bachelor’s degree
  • Fourteen (14) years of Software Engineering experience supporting programs of similar scope, type, and complexity
  • Strong Java development experience
  • Experience with MapReduce programming models and technologies
  • Cloud computing experience
  • Experience with distributed data processing systems
  • Familiarity with GhostMachine and QTA tools
  • Ability to develop, debug, and enhance complex software systems
  • Experience working with large datasets and scalable architectures
  • Strong problem-solving skills and ability to work in collaborative environments

Nice To Haves

  • Experience with Hadoop ecosystem technologies
  • Familiarity with HDFS and NoSQL data stores
  • Experience with data serialization formats such as JSON or BSON
  • Experience optimizing performance of distributed systems
  • Background in mission or analytics-focused environments

Responsibilities

  • Refining a Java service that powers a distributed analytics pipeline
  • Optimizing MapReduce jobs to improve performance across a Hadoop cluster
  • Troubleshooting data processing issues
  • Integrating cloud-based capabilities
  • Ensuring systems remain reliable and scalable
  • Contributing to new feature development
  • Improving existing systems
  • Integrating mission-specific technologies such as GhostMachine and QTA
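For candidates less familiar with the MapReduce programming model this role centers on, the sketch below shows the core map and reduce phases of a word count, the model's canonical example, in plain Java streams. This is an illustration of the model only, not of this team's codebase; a production Hadoop job would instead extend Hadoop's `Mapper` and `Reducer` classes, and all names here are hypothetical.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class WordCountSketch {

    // Map phase: split each input line into words and emit a (word, 1) pair per occurrence.
    static Stream<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\W+"))
                     .filter(w -> !w.isEmpty())
                     .map(w -> Map.entry(w, 1));
    }

    // Reduce phase: group pairs by key and sum the counts for each word.
    static Map<String, Integer> reduce(Stream<Map.Entry<String, Integer>> pairs) {
        return pairs.collect(Collectors.toMap(
                Map.Entry::getKey,
                Map.Entry::getValue,
                Integer::sum)); // merge function plays the role of the reducer
    }

    public static void main(String[] args) {
        List<String> lines = List.of("big data big cloud", "cloud data");
        Map<String, Integer> counts =
                reduce(lines.stream().flatMap(WordCountSketch::map));
        System.out.println(counts.get("big"));   // 2
        System.out.println(counts.get("data"));  // 2
        System.out.println(counts.get("cloud")); // 2
    }
}
```

In a real Hadoop cluster the map outputs would be partitioned and shuffled across nodes before the reduce phase; the single-process stream pipeline above collapses that step into one in-memory grouping.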

Benefits

  • Ample PTO to rest and recharge—plus all federal holidays and your birthday off, just because.
  • Multiple medical plan options, including ones with zero deductible or premium for employees.
  • Generous 401(k) with immediate vesting—because your future matters now.
  • Exciting bonus opportunities, from profit sharing to quarterly awards and President’s Club recognition.
  • A culture of collaboration, connection, and fun, with regular team activities that go beyond the work.