Senior Data Engineer (Op AI)

PMAT · San Diego, CA
$185,000 - $195,000 · Onsite

About The Position

PMAT is seeking a Senior Data Engineer to design, build, and operationalize advanced data pipelines and analytics supporting Naval and DoD mission challenges. This role requires deep experience across data engineering, distributed systems, event streaming, cloud data frameworks, and mission-focused analytics. The Senior Data Engineer will support complex data integration efforts, collaborate with cross-functional engineering teams, and lead data architecture improvements aligned with operational Navy needs.

Requirements

  • Strong programming skills in Python, Go, Rust, Pandas, R, SQL, or related languages
  • 10 years of experience as a business analyst, data analyst, data scientist, data engineer, database administrator, geospatial analyst/engineer, machine learning engineer, or software engineer, or in a related role.
  • Willingness to travel
  • Willingness to work 100% on-site
  • Ability to safely carry tools, equipment, and materials aboard ship, including ascending and descending shipboard ladders (stairwells) and navigating confined spaces while maintaining required points of contact. Tools and equipment will weigh no more than 50 lbs.
  • Ability to perform required work aboard Navy vessels and in shipboard environments, including navigating narrow passageways, ascending and descending ladders (stairwells), working on elevated platforms, and operating in variable sea conditions.
  • Ability to perform activities on a recurring basis during shipboard operations or testing evolutions.
  • Ability to comply with Navy safety requirements and wear required personal protective equipment (PPE).
  • Ability to operate in a DDIL (Denied, Disrupted, Intermittent, and Limited-bandwidth) environment
  • Reasonable accommodations may be provided to enable qualified individuals to meet these requirements and perform the essential functions of the position.
  • US Citizenship
  • No dual citizenship
  • Active DoD TS/SCI clearance required

Nice To Haves

  • Experience with large-scale data architecture across secure DoD or government environments.
  • Experience working with NAVWAR, NIWC Pacific, or naval C2/ISR programs.
  • Experience architecting data solutions across multi-domain or cross-domain systems.
  • Familiarity with MLOps practices or deploying analytics/ML-enabled pipelines in classified, cross-domain, or constrained environments
  • Experience with cloud-native data architecture and API design.
  • Programming experience in Go or Rust.
  • Proven experience designing, developing, and deploying complex data pipelines.
  • Experience working with multiple data formats (e.g., CSV, JSON, XML, Parquet, ORC).
  • Familiarity with event streaming technologies (e.g., Kafka, AWS Kinesis, RabbitMQ, ZeroMQ).
  • Experience deploying, monitoring, and optimizing operational data pipelines.
  • Expertise in Elasticsearch, Redis, S3, PostgreSQL, or related datastores.
  • Experience with AWS data services (EFS, RDS, S3, SNS, SQS).
  • Experience with distributed computing: AWS Lambda, Dask, Spark.
  • Familiarity with Airbyte, Airflow, dbt, Iceberg, Snowflake.
  • Experience managing, integrating, and retrieving GIS data (ArcGIS, PostGIS).
  • Understanding of cybersecurity principles as applied to data applications and operational environments (including DDIL constraints)
  • Strong analytical and problem-solving skills.
  • Excellent communication skills in a collaborative team environment.
  • Previous experience supporting government agencies or military organizations.
  • Equivalent years of relevant experience in lieu of degree.
  • Master of Science in Computer Science, Data Science, Geography, Math, Machine Learning, or Statistics.
  • Additional certifications in data engineering, cloud, geospatial, or cybersecurity are a plus if required by contract.

Responsibilities

  • Collaborate with cross-functional teams to understand and address Navy operational challenges using data pipelines and analytics.
  • Design, develop, and implement data pipelines and analytics for naval applications.
  • Perform exploratory data analysis, algorithm development, and testing.
  • Normalize and structure data to common standards for interoperability.
  • Work with multiple data formats, including CSV, JSON, XML, Parquet, and ORC.
  • Develop and deploy data pipelines and analytics in real-world operational environments.
  • Deploy, monitor, and optimize data pipelines to ensure high performance and reliability.
  • Implement event streaming pipelines using Apache Kafka, AWS Kinesis, RabbitMQ, or ZeroMQ.
  • Utilize distributed computing platforms such as AWS Lambda, Dask, or Spark.
  • Leverage cloud-native tools including AWS S3, RDS, EFS, SNS, and SQS.
  • Utilize data pipeline frameworks such as Airbyte, Apache Airflow, dbt, Apache Iceberg, and Snowflake.
  • Work with GIS data using ArcGIS, PostGIS, and related tooling.
  • Implement containerized environments using Docker or Kubernetes.
  • Apply cybersecurity principles in the context of secure DoD data applications.
  • Communicate findings and engineering solutions effectively with technical and mission stakeholders.