Software Developer (Java, API)

Ardent Principles | Chantilly, VA | Onsite

About The Position

We are seeking a Software Developer with strong experience in Java development, hands-on skill in designing, building, maintaining, and troubleshooting APIs, and proven capability with Python to support data-driven applications. The ideal candidate brings demonstrated Big Data experience, contributing to scalable processing pipelines and high-volume data environments.

Who We Are

We offer advanced services in data science, data engineering, software engineering, AI solutions, cybersecurity, staff augmentation, and IT program management. Passionate Integrity, Driven by Excellence: Ardent Principles offers a competitive salary range and a comprehensive, industry-leading benefits package designed to support long-term stability and employee well-being. We provide more than a position; we offer a workplace committed to excellence, integrity, and mission-focused impact. Our mission is to act as a bridge between satisfied clients and fulfilled employees, ensuring that your job and well-being are our top priorities, because your satisfaction leads to the success of our clients. Join us as we continue building the future of secure, high-impact solutions.

Requirements

  • Active TS/SCI with Full Scope Polygraph
  • Java development
  • Big Data technologies, including designing and operating Big Data systems
  • API development, maintenance, and troubleshooting
  • Databases and storage: PostgreSQL, MariaDB, ELK (Elasticsearch), MinIO, AWS S3, Neo4j, MongoDB, and other NoSQL stores
  • Languages: Python (PyPI libraries)
  • Operating Systems: CentOS 7, Rocky Linux 8
  • Orchestration: Kubernetes, Docker, Docker Compose, Docker Swarm
  • Development Tools: VS Code, GitLab, JupyterHub/Jupyter notebooks, MATLAB
  • Environments: large collaboration and development environments
  • Data types: unstructured, structured, or semi-structured data, including CSV, JSON, JSONL, Avro, Protocol Buffers, Parquet, etc.

Nice To Haves

  • Designing cloud-native architectures
  • Designing and operating big data systems
  • Developing and operating graph traversal capabilities built upon Apache Gremlin
  • Building and operating high-performance data processing pipelines using Lambda, Step Functions and PySpark
  • Developing Machine Learning Operations (MLOps) pipelines for large-scale application
  • Understanding of IT Service Management and common SLA measurements
  • Presenting solutions and requirements to diverse audiences
  • Working with container orchestration technologies such as AWS ECS, AWS Fargate, and Kubernetes, or other enhanced capabilities
  • Managing large operational cloud environments spanning multiple tenants using multi-account management, AWS Well-Architected best practices, and AWS Organizational Units/Service Control Policies (OU/SCP)
  • Microservices: building decoupled, lightweight systems with RESTful endpoints
  • Working from a total-systems perspective, including a technical understanding of the relationships, dependencies, and requirements among hardware and software components
  • Consulting with customers to determine present and future user needs
  • Providing frequent contact with customers and traceability within program documents, the overall computing environment, and the architecture
  • Certifications: AWS Certified Solutions Architect
  • AWS Machine Learning Certification(s)
  • Agile certification
  • Azure Security+
  • GSEC

Responsibilities

  • Designing cloud-native architectures using cloud services such as AWS, Google, IBM, and Oracle
  • Building and optimizing performance of large-scale graph databases (tens of billions of edges) using DynamoDB or new enhanced capabilities
  • Developing and operating graph traversal capabilities built upon Apache Gremlin or new enhanced capabilities
  • Developing and operating NoSQL solutions to complex big data applications
  • Data modeling for performance, partition sharding, record/event aggregation workflows, stream processing, and metrics gathering
  • Designing and operating large-scale serverless geospatial indexes built with GeoMesa
  • Partition and sort key design and implementation to ensure consistent performance
  • Aggregation operations to de-duplicate records on continuous data feeds
  • Providing subject matter expertise on transitioning from relational databases to NoSQL
  • Building and operating high performance data processing pipelines using Lambda, Step Functions and PySpark
  • Building high-quality user interfaces/user experiences (UI/UX) with the React framework and WebGL
  • Designing and operating large scale graph databases using Apache Cassandra
  • Performing in-depth technical analysis of large-scale graph databases to develop implementation strategies for search optimizations
  • Developing technical capabilities for processing, persistence and search of datasets that are collected or maintained using standards
  • Facilitating engineering discussions across teams representing multiple stakeholders to develop and execute implementation strategies that meet mission needs
  • Developing Machine Learning Operations (MLOps) pipelines for large scale application
  • Maintaining configuration of software using configuration management resources such as GitHub
  • Designing, building, and operating big data systems (persistence, partitioning, indexing) at a scale of trillions of records/events
  • Building Apache NiFi (Niagara Files) applications or new enhanced capabilities
  • Developing and operating Kubernetes infrastructure
  • Supporting engineering efforts that contribute to the delivery of capabilities such as datasets and functionality such as communications and geospatial workflows
  • Implementing DevSecOps and agile development in production environments
  • Agile software development and testing
  • Meeting federal security, regulatory, and compliance requirements and developing security accreditation packages
  • Implementing data security and governance using centralized security controls such as LDAP, data encryption, and access auditing
  • Applying specialized technologies optimized for the particular use of the data, such as relational databases, NoSQL databases (e.g., Cassandra), or object storage
  • Using Apache TinkerPop, Gremlin, and/or JanusGraph to design, develop, implement, and maintain systems
  • Applying knowledge of graph databases to design, develop, implement, and maintain systems
  • Writing interfaces in C or C++

Benefits

  • Highly Competitive Salary: Recognizing and rewarding your expertise and contributions.
  • Generous Paid Time Off: Providing ample time for rest, relaxation, and personal pursuits.
  • Dedicated Training Budget: Supporting continuous learning and professional development.
  • 100% Employer-Covered Family Vision, Dental, and Health Insurance: Ensuring comprehensive health coverage for you and your family.
  • 100% Employer-Covered Life and Disability Insurance: Offering financial security and peace of mind.
  • 401(k) Plan with a 6% Employer Match: Helping you plan and save for a secure retirement, with 100% vesting from day one.
  • 11 Paid Government Holidays: Observing national holidays to ensure time off with family and friends.
  • Spot Bonuses for Exceptional Performance: Rewarding outstanding contributions and achievements.