Cloud Software Engineer 2

Avid Technology Professionals, Annapolis Junction, MD

About The Position

Eight (8) years of software engineering experience in programs and contracts of similar scope, type, and complexity is required, two (2) years of which must be in programs utilizing Big-Data Cloud technologies and/or Distributed Computing. A Bachelor's degree in Computer Science or a related discipline from an accredited college or university is required. The following substitutions are allowed:

  • Four (4) years of cloud software engineering experience on projects with similar Big-Data systems may be substituted for the Bachelor's degree.
  • A Master's degree in Computer Science or a related discipline from an accredited college or university may be substituted for two (2) years of experience.
  • A Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of Cloud experience.

The specific Cloud, Information Technology, SIGINT, and configuration-management experience required for this position is detailed under Requirements below.
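This position's requirements repeatedly reference complex MapReduce analytics. The core pattern can be sketched in plain Java with no Hadoop dependencies (class and method names here are illustrative, not from the posting): each input record is mapped to key/value emissions, identical keys are shuffled together, and a reduce step aggregates per key.

```java
import java.util.*;
import java.util.stream.*;

// Minimal in-memory sketch of the MapReduce pattern: map each record to
// (word, 1) emissions, shuffle by key, then reduce by summing counts.
public class MapReduceSketch {
    public static Map<String, Long> wordCount(List<String> lines) {
        return lines.stream()
                // Map phase: split each line into individual word emissions.
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(w -> !w.isEmpty())
                // Shuffle + reduce phase: group identical keys, sum per key.
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        List<String> corpus = List.of("hadoop maps then reduces", "maps shuffle to reducers");
        System.out.println(wordCount(corpus));
    }
}
```

In a real Hadoop job the map and reduce steps would be separate `Mapper` and `Reducer` classes running across a multi-node cluster; this sketch only shows the data flow the requirement describes.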

Requirements

  • Eight (8) years of software engineering experience in programs and contracts of similar scope, type, and complexity is required; two (2) years of which must be in programs utilizing Big-Data Cloud technologies and/or Distributed Computing.
  • Bachelor's degree in Computer Science or a related discipline from an accredited college or university is required.
  • Two (2) years of experience with Cloud and Distributed Computing Information Retrieval (IR).
  • One (1) year of experience implementing code that interacts with an implementation of a Cloud Big Table.
  • One (1) year of experience implementing code that interacts with an implementation of a Cloud Distributed File System.
  • One (1) year of experience with implementing complex MapReduce analytics.
  • One (1) year of experience with implementing code that interacts with Cloud Distributed Coordination Frameworks.
  • Object Oriented Design and Programming, Java, Eclipse or similar development environment, MAVEN, RESTful web services.
  • Cloud and Distributed Computing Technologies: at least one or a combination of several of the following areas - YARN, J2EE, MapReduce, ZooKeeper, HDFS, HBase, JMS, Concurrent Programming, Multi-Node implementation/installation, and other applicable technologies.
  • Cloud and Distributed Computing Information Retrieval: at least one or a combination of several of the following areas - HDFS, HBase, Apache Lucene, Apache Solr, MongoDB.
  • Ingesting, Parsing and Analysis of Disparate Data-sources and formats: XML, JSON, CSV, Binary Formats, Sequence or Map Files, Avro and related technologies
  • Aspect Oriented Design and Development
  • Debugging and Profiling Cloud and Distributed Installations: Java Virtual Machine (JVM) memory management, Profiling Java Applications
  • UNIX/LINUX, CentOS
  • Experience with at least one SIGINT collection discipline area (FORNSAT, CABLE, Terrestrial/Microwave, Overhead, or ELINT).
  • Geolocation, emitter identification, and signal applications.
  • Joint program collection platforms and dataflow architectures; signals characterization analysis
  • CentOS and Linux/RedHat
  • Configuration management tools such as Subversion, ClearQuest, or Razor
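The ingest requirement above spans disparate formats (XML, JSON, CSV, binary, Sequence/Map Files, Avro). The common thread is normalizing each source into a shared record shape. A minimal plain-Java sketch of the CSV case follows; the class and field names are illustrative, and a production pipeline would use a parser library and emit Avro or SequenceFile records instead of in-memory maps.

```java
import java.util.*;

// Minimal sketch of normalizing one disparate source (CSV) into a common
// record shape: the header row drives a field-name-to-value mapping for
// every subsequent data row.
public class CsvIngestSketch {
    public static List<Map<String, String>> parse(List<String> csvLines) {
        if (csvLines.isEmpty()) return List.of();
        String[] header = csvLines.get(0).split(",");
        List<Map<String, String>> records = new ArrayList<>();
        for (String line : csvLines.subList(1, csvLines.size())) {
            String[] fields = line.split(",", -1); // -1 keeps trailing empty fields
            Map<String, String> record = new LinkedHashMap<>();
            for (int i = 0; i < header.length && i < fields.length; i++) {
                record.put(header[i].trim(), fields[i].trim());
            }
            records.add(record);
        }
        return records;
    }

    public static void main(String[] args) {
        System.out.println(parse(List.of("id,source", "1,alpha", "2,beta")));
    }
}
```

Parsers for the other listed formats (JSON, XML, Avro) would plug into the same record-producing interface, which is what makes a multi-format ingest pipeline tractable.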