Software Engineer

Ardent Principles, Inc.
Chantilly, VA

About The Position

REQUIRED: Active TS/SCI with Full Scope Polygraph
LOCATION: Full-time, onsite in Chantilly, VA

We are seeking a highly skilled Software Engineer with deep experience building and operating large-scale, cloud-native data systems that power mission-critical analytics and graph-based applications. The ideal candidate is fluent in Java and Python, comfortable designing and optimizing Big Data architectures, and capable of developing resilient APIs and data processing pipelines.

Who We Are: We offer advanced services in data science, data engineering, software engineering, AI solutions, cybersecurity, staff augmentation, and IT program management. Passionate Integrity, Driven by Excellence: "Ardent Principles" signifies our unwavering commitment to excellence, driven by a profound passion and a strict adherence to ethical values. We believe that happy employees make for happy clients. Our mission is to act as a bridge between satisfied clients and fulfilled employees, ensuring that your job and well-being are our top priorities, because your satisfaction leads to the success of our clients. With a competitive salary range and industry-leading benefits, Ardent Principles offers more than just a job - we offer a career path filled with growth and opportunities. Join us and let's shape the future together!

Requirements

  • Active TS/SCI with Full Scope Polygraph
  • Java development.
  • Big Data technologies, to include designing and operating Big Data systems.
  • API development, maintenance, and troubleshooting.
  • Python.
  • Designing cloud-native architectures using cloud services such as AWS, Google Cloud, IBM Cloud, and Oracle Cloud.
  • Building and optimizing performance of large-scale graph databases (tens of billions of edges) using DynamoDB or new enhanced capabilities.
  • Developing and operating graph traversal capabilities built upon Apache Gremlin or new enhanced capabilities.
  • Developing and operating NoSQL solutions to complex big data applications.
  • Data modeling for performance, partition sharding, record/event aggregation workflows, stream processing, and metrics gathering.
  • Designing and operating large-scale serverless geospatial indexes built with GeoMesa.
  • Partition and sort key design and implementation to ensure consistent performance.
  • Aggregation operations to de-duplicate records on continuous data feeds.
  • Subject matter expertise in migrating from relational databases to NoSQL.
  • Building and operating high performance data processing pipelines using Lambda, Step Functions and PySpark.
  • Building high-quality user interfaces/user experiences (UI/UX) with the React framework and WebGL.
  • Designing and operating large scale graph databases using Apache Cassandra.
  • Performing in-depth technical analysis of large-scale graph databases to develop implementation strategies for search optimizations.
  • Developing technical capabilities for processing, persistence and search of datasets that are collected or maintained using standards common in the Sponsor's community.
  • Facilitating engineering discussions across teams representing multiple stakeholders to develop and execute implementation strategies that meet mission needs.
  • Developing Machine Learning Operations (MLOps) pipelines for large-scale applications.
  • Maintaining configuration of software using configuration management resources such as GitHub.
  • Designing, building, and operating big data systems (persistence, partitioning, indexing) at a scale of trillions of records/events.
  • Apache NiFi (Niagara Files) applications or new enhanced capabilities.
  • Developing and operating Kubernetes infrastructure.
  • Supporting engineering efforts that contribute to delivery of capabilities such as datasets and functionality such as communications and geospatial workflows.
  • Implementing DevSecOps and agile development in production environments.
  • Agile software development and testing.
  • Federal security, regulatory and compliance requirements and security accreditation package development.
  • Data security and governance using centralized security controls like LDAP, encrypting the data, and auditing access to the data.
  • Specialized technologies that are optimized for the particular use of the data, such as relational databases, a NoSQL database (Cassandra), or object storage.
  • Apache TinkerPop, Gremlin, and/or JanusGraph to design, develop, implement, and maintain systems.
  • Knowledge of graph databases to design, develop, implement, and maintain systems.
  • Using C or C++ to write interfaces.
  • Using centralized security controls like LDAP, encrypting data, and auditing access to data.
  • Databases: Postgres, MariaDB, ELK, MinIO, AWS S3, Neo4j, MongoDB, NoSQL.
  • Languages: Python (PyPI libraries).
  • Operating Systems: CentOS 7, Rocky Linux 8.
  • Orchestration: Kubernetes, Docker, Docker Compose, Docker Swarm.
  • Development Tools: VS Code, GitLab, JupyterHub/Jupyter notebooks, MATLAB.
  • Environments: large collaboration and development environments.
  • Data types: Unstructured, structured, or semi-structured data, including: CSV, JSON, JSONL, AVRO, Protocol Buffers, Parquet, etc.

Nice To Haves

  • Designing cloud-native architectures using the Sponsor's cloud services.
  • Designing and operating big data systems within the Sponsor's policy and regulatory environment.
  • Developing and operating graph traversal capabilities using the Sponsor's data graphing tool traversal capabilities built upon Apache Gremlin.
  • Building and operating high performance data processing pipelines using Lambda, Step Functions and PySpark on the Sponsor's infrastructure with EMR.
  • Working with the Sponsor's enterprise services used for Data Management, including the enterprise catalog service (and associated APIs), and Policy Decision Points (PDPs).
  • Developing Machine Learning Operations (MLOps) pipelines for large scale application in the Sponsor's environment.
  • Understanding of IT Service Management and common SLA measurements.
  • Presenting solutions and requirements to diverse audiences.
  • Working with container orchestration technologies such as AWS ECS, AWS Fargate, and Kubernetes or other enhanced capabilities available.
  • Managing large operational cloud environments spanning multiple tenants using Multi-Account management, AWS Well Architected Best Practices, and AWS Organization Units/Service Control Policies (OU/SCP).
  • Micro-services such as building decoupled systems, utilizing RESTful endpoints and lightweight systems.
  • Total systems perspectives, including a technical understanding of systems and applications relationships, dependencies, and requirements of hardware and software components.
  • Consulting with customers to determine present and future user needs.
  • Providing frequent contact with customers, traceability within program documents, and the overall computing environment and architecture.
  • Desired certifications: AWS Certified Solutions Architect, AWS Machine Learning certification(s), Agile certification, Azure, Security+, GSEC, CCNA.

Benefits

  • Highly Competitive Salary: Recognizing and rewarding your expertise and contributions.
  • Generous Paid Time Off: Providing ample time for rest, relaxation, and personal pursuits.
  • Dedicated Training Budget: Supporting continuous learning and professional development.
  • 100% Employer-Covered Family Vision, Dental, and Health Insurance: Ensuring comprehensive health coverage for you and your family.
  • 100% Employer-Covered Life and Disability Insurance: Offering financial security and peace of mind.
  • 401(k) Plan with a 6% Employer Match: Helping you plan and save for a secure retirement, with 100% vesting from day one.
  • 11 Paid Government Holidays: Observing national holidays to ensure time off with family and friends.
  • Spot Bonuses for Exceptional Performance: Rewarding outstanding contributions and achievements.

What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 1-10 employees
