Software Engineer (Data Analytics Focus)

BigBear.ai, Columbia, MD

About The Position

BigBear.ai is seeking a Software Engineer to join the Mission Intelligence (MI) team, which supports a variety of data engineering, analysis, and automation efforts. The MI team maintains a business intelligence platform that supports data-driven decision-making across the enterprise. Your primary responsibility will be to ensure the availability of this mission-essential platform. You will also contribute to the MI team as a software engineer and analytic developer, designing, developing, and integrating solutions that analyze large data sets to establish mission cognizance and deliver mission-centric insights.

Requirements

  • B.S. degree in a technical discipline with 8 years of experience, or 4 additional years of experience in lieu of a degree.
  • Clearance: TS/SCI with polygraph.
  • Production-grade software development experience in Java and Python.
  • Experience deploying and maintaining Elastic Stack components (Elasticsearch, Kibana).
  • Familiarity with customer authentication and authorization platforms and standards.
  • Experience with one or more Extract, Transform, Load (ETL) or data engineering platforms (e.g., Apache NiFi, Apache Airflow).
  • Experience with service containerization and deployment using Docker and Kubernetes.
  • Familiarity with Git for software version control.
  • Experience with Atlassian Tools (Jira, Confluence).

Nice To Haves

  • Knowledge of MapReduce analytic environments (e.g., Hadoop).
  • Experience with cloud-based deployment environments (e.g., AWS).
  • Experience prototyping web applications (JavaScript).
  • Knowledge of end-to-end SIGINT collection and analysis systems.
  • Experience with production CNO capabilities and operations.

Responsibilities

  • Maintain and advance a mission-essential business intelligence platform.
  • Elicit requirements from stakeholders, gather and analyze available data, and develop solutions for delivering key metrics.
  • Develop analytics to combine, normalize, and enrich large data sets.
  • Design and implement ETL workflows to convert and normalize data.
  • Augment the platform with new tools or technologies.
  • Provide guidance to junior software engineers and data scientists.