For the OPS Consulting team, ‘the power to help’ means helping our clients, helping serve the mission, helping our employees and their families, and helping the community. Headquartered in Hanover, MD, OPS Consulting has over two decades of experience specializing in the most mission-critical operations. We are thought leaders and innovators. The ingenuity of our developers, engineers, cyber experts, linguists, and analysts is dedicated to empowering our clients, fulfilling the mission, and remaining trusted leaders and advisers in national security and technology solutions.

We are seeking Software Engineers to design, develop, test, deploy, document, maintain, and enhance complex software systems supporting a secure, large-scale, multi-tenant Linux computing environment. These systems include high-performance compute platforms, distributed data processing pipelines, workflow orchestration services, operational data flows, and full-stack web applications that manage access to enterprise resources. Work may involve processing-intensive analytics, automation tooling, containerized microservices, real-time systems, data repositories, and platform integrations. Engineers work independently or as part of Agile teams to deliver secure, scalable, and reliable software solutions supporting mission-critical workloads.

Multiple Openings

We are hiring multiple Software Engineers at varying levels (SWE-1 through SWE-4). While all roles operate within the same secure Linux- and Kubernetes-based ecosystem, each position has a primary technical focus area. Engineers are aligned with a single core focus area based on their experience and interests; candidates are not expected to perform across all areas listed below.

Current focus areas include:

Platform & DevOps Engineering
- Develop automation using Bash and Python
- Implement Infrastructure-as-Code using tools such as Ansible
- Design and maintain CI/CD pipelines (GitLab CI, Jenkins, etc.)
- Improve release management, build reliability, and deployment processes

Data Engineering & Distributed Processing
- Design and optimize data ingress and egress mechanisms
- Develop distributed processing jobs using Apache Spark
- Implement data validation, transformation, and anomaly detection
- Work with structured and semi-structured data formats (Parquet, JSON, CSV, XML)

Workflow Orchestration
- Design and maintain Apache Airflow DAGs
- Implement observable, scalable workflow patterns
- Integrate distributed processing engines and containerized runtimes

Full-Stack & API Development
- Develop web interfaces using JavaScript, HTML, CSS, and modern frameworks (e.g., React)
- Design REST APIs using Python frameworks (e.g., FastAPI)
- Implement secure authentication and authorization integrations
- Develop relational and non-relational database applications

Containerized Application Engineering
- Develop Python-based services
- Integrate with MongoDB, PostgreSQL, and messaging frameworks (e.g., RabbitMQ, Kafka)
- Deploy services using Docker, containerd, Podman, and Kubernetes
- Create and maintain Helm charts

Dataflow Platform Engineering
- Design and administer Apache NiFi data flows
- Create and troubleshoot complex operational data transport pipelines
- Implement secure and compliant data handling practices
Job Type: Full-time
Career Level: Entry Level