Data Engineer (Op AI)

PMAT · San Diego, CA
$160,000 - $180,000 · Onsite

About The Position

PMAT is seeking a Data Engineer to support the design, development, and deployment of high-quality data pipelines and analytics for mission-focused applications. This role integrates data engineering, exploratory data analysis, distributed systems, and cloud-native technologies to deliver performant, reliable, and secure data capabilities. The Data Engineer will collaborate across multidisciplinary teams to ensure high-quality data output, robust testing, and operational readiness in support of DoD missions.

Requirements

  • Strong programming skills in Python, Go, Rust, Pandas, R, SQL, or related languages
  • 3 years of experience as a business analyst, data analyst, data scientist, data engineer, database administrator, geospatial analyst/engineer, machine learning engineer, software engineer, or in a related role.
  • Willingness to travel
  • Willingness to work 100% on site
  • Ability to safely carry tools, equipment, and materials aboard ship, including ascending and descending shipboard ladders (stairwells) and navigating confined spaces while maintaining required points of contact. Tools and equipment will weigh no more than 50 lbs.
  • Ability to perform required work aboard Navy vessels and in shipboard environments, including navigating narrow passageways, ascending and descending ladders (stairwells), working on elevated platforms, and operating in variable sea conditions.
  • Ability to perform activities on a recurring basis during shipboard operations or testing evolutions.
  • Ability to comply with Navy safety requirements and wear required personal protective equipment (PPE).
  • Ability to operate in a DDIL (Denied, Disrupted, Intermittent, and Limited bandwidth) office environment
  • Reasonable accommodations may be provided to enable qualified individuals to meet these requirements and perform the essential functions of the position.
  • US Citizenship
  • No dual citizenship
  • Active DoD TS clearance required

Nice To Haves

  • Experience with large-scale data architecture across secure DoD or government environments.
  • Experience supporting NAVWAR, NIWC Pacific, or other Navy programs.
  • Experience integrating production data pipelines and analytics into operational mission systems, including secure deployment, monitoring, troubleshooting, and performance tuning.
  • Familiarity with MLOps practices or deploying analytics/ML-enabled pipelines in classified, cross-domain, or constrained environments
  • Experience designing and/or normalizing data to common standards to support interoperability across teams/systems.
  • Experience working with multiple data formats (e.g., CSV, JSON, XML, Parquet, ORC)
  • Familiarity with event streaming platforms (e.g., Apache Kafka, RabbitMQ, ZeroMQ)
  • Experience with data pipeline frameworks and libraries (e.g., Apache Airflow, dbt, Airbyte, Apache Iceberg, Snowflake, or similar)
  • Experience retrieving and managing geospatial/GIS data (e.g., ArcGIS, PostGIS)
  • Expertise with Elasticsearch, Redis, S3, PostgreSQL, or similar data stores.
  • Experience with distributed computing (AWS Lambda, Dask, Spark).
  • Familiarity with cloud platforms (AWS, Azure) and containerization (Docker, Kubernetes).
  • Understanding of cybersecurity principles as applied to data applications and operational environments (including DDIL constraints)
  • Active DoD TS/SCI clearance preferred
  • Equivalent years of relevant experience in lieu of degree
  • Bachelor of Science in Computer Science, Data Science, Geography, Math, Machine Learning, or Statistics.
  • Additional certifications in cloud, data engineering, GIS, or cybersecurity are a plus and may be required by contract.

Responsibilities

  • Conduct data pre-processing, exploratory data analysis, and data pipeline engineering to ensure performant and high-quality data output.
  • Conduct thorough testing and validation of data pipelines and analytics to ensure accuracy, reliability, and robustness.
  • Design or normalize data to common standards to support interoperability and analytical workflows.
  • Develop and deploy data pipelines and analytics in real-world applications.
  • Work with multiple data formats, including CSV, JSON, XML, Parquet, and ORC.
  • Perform exploratory data analysis, algorithm development, and testing.
  • Deploy, monitor, and improve data pipelines for operational environments.
  • Implement event streaming pipelines using Apache Kafka, RabbitMQ, or ZeroMQ.
  • Collaborate with analytics, engineering, and mission teams to ensure effective data integration and output quality.
  • Stay current with emerging trends in data engineering, distributed systems, and modern data architecture.
  • Document data processes, pipeline structures, and engineering best practices.