Cloud Software Engineer 3

Avid Technology Professionals, Annapolis Junction, MD

About The Position

Twelve (12) years of software engineering experience in programs and contracts of similar scope, type, and complexity is required. A Bachelor's degree in Computer Science or a related discipline from an accredited college or university is required; four (4) years of the experience must be in programs utilizing Big Data cloud technologies and/or distributed computing. Four (4) years of cloud software engineering experience on projects with similar Big Data systems may be substituted for a Bachelor's degree. A Master's degree in Computer Science or a related discipline from an accredited college or university may be substituted for two (2) years of experience. Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of cloud experience.

The following cloud-related experience is required:
  a. Two (2) years of Cloud and/or Distributed Computing Information Retrieval (IR).
  b. One (1) year of experience implementing code that interacts with a Cloud Big Table implementation.
  c. One (1) year of experience implementing code that interacts with a Cloud Distributed File System implementation.
  d. One (1) year of experience implementing complex MapReduce analytics.
  e. One (1) year of experience implementing code that interacts with Cloud Distributed Coordination Frameworks.
  f. One (1) year of experience architecting Cloud Computing solutions.
  g. One (1) year of experience debugging problems with cloud-based Distributed Computing Frameworks.
  h. One (1) year of experience managing multi-node cloud-based installations.

Experience in Computer Network Operations:
  a. Utility Computing, Network Management, Virtualization (VMware or VirtualBox), Cloud Computing.
  b. Multi-Node Management and Installation: management and installation of cloud and distributed computing across multiple nodes using Python, CFEngine, Bash, Ruby, or related technologies.

Experience in Information Assurance: securing cloud-based and distributed applications through industry-standard techniques such as firewalls, PKI certificate and server authentication, with experience in corporate authentication services.

Experience in Information Technology:
  a. Object-Oriented Design and Programming, Java, Eclipse or a similar development environment, Maven, RESTful web services.
  b. Cloud and Distributed Computing Technologies: at least one, or a combination, of the following areas: YARN, J2EE, MapReduce, ZooKeeper, HDFS, HBase, JMS, concurrent programming, multi-node implementation/installation, and other applicable technologies.
  c. Cloud and Distributed Computing Information Retrieval: at least one, or a combination, of the following areas: HDFS, HBase, Apache Lucene, Apache Solr, MongoDB.
  d. Ingesting, parsing, and analysis of disparate data sources and formats: XML, JSON, CSV, binary formats, Sequence or Map Files, Avro, and related technologies.
  e. Aspect-Oriented Design and Development.
  f. Debugging and profiling cloud and distributed installations: Java Virtual Machine (JVM) memory management, profiling Java applications.
  g. UNIX/Linux, CentOS.

Experience in SIGINT:
  1. Experience with at least one SIGINT collection discipline (FORNSAT, CABLE, Terrestrial/Microwave, Overhead, or ELINT).
  2. Geolocation, emitter identification, and signal applications.
  3. Joint program collection platforms and dataflow architectures; signals characterization analysis.

Other experience:
  1. CentOS, Linux/Red Hat.
  2. Configuration management tools such as Subversion, ClearQuest, or Razor.
