Senior Data Engineer

Athenix Special Missions, LLC

About The Position

Athenix Special Missions is seeking a Senior Data Engineer in Fort Bragg, North Carolina!

ASM Quality Policy: To meet or exceed our customers’ expectations for quality, delivery, and service through continual improvement, striving to meet our objectives, and committing to meeting all legal and statutory requirements.

We are seeking a highly skilled and versatile Data Engineer to join our team and play a critical role in designing, building, and optimizing our data infrastructure. This position combines elements of data analysis, engineering, and architecture to ensure the availability, reliability, and accessibility of data for analytics and decision-making. The ideal candidate will have a strong technical background, sharp problem-solving skills, and a passion for working with data, along with the ability to collaborate with cross-functional teams to deliver innovative data solutions.

Requirements

  • Active TS/SCI clearance
  • Bachelor’s degree in Computer Science, Data Science, Engineering, or a related field (or equivalent experience), or a CSSLP or CISSP-ISSAP certification
  • Strong programming skills in languages such as Python, Java, or Scala.
  • Proficiency in data analysis tools and programming languages (e.g., SQL, R).
  • Experience with data processing frameworks (e.g., Apache Spark, Hadoop) and orchestration tools (e.g., Airflow); see the orchestration sketch after this list.
  • Familiarity with cloud-based data solutions (e.g., AWS Redshift, Google BigQuery, Azure Data Factory).
  • Proven experience in data modeling, database design, and data architecture frameworks.
  • Strong analytical and problem-solving skills, with attention to detail.
  • Excellent communication and collaboration skills to work effectively in cross-functional teams.
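
To illustrate the orchestration experience called for above, here is a minimal sketch of a daily extract-transform-load DAG, assuming Airflow 2.4+ and its TaskFlow API; the DAG name, schedule, and task bodies are hypothetical placeholders, not anything specific to this role.

```python
# Minimal Airflow 2.4+ DAG sketch: a daily extract -> transform -> load chain.
# The function bodies and record shapes are illustrative placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_ingest():
    @task
    def extract() -> list[dict]:
        # Placeholder: pull records from a source system (API, S3, database).
        return [{"id": 1, "value": 42}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Placeholder: drop records that fail basic cleaning rules.
        return [r for r in records if r["value"] is not None]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder: write the cleaned records to the target store.
        print(f"loaded {len(records)} records")

    load(transform(extract()))


daily_ingest()
```

The same extract/transform/load shape scales out by swapping the placeholder bodies for Spark jobs or warehouse loads.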

Nice To Haves

  • Master’s degree in a related field or professional certifications (e.g., AWS Certified Data Analytics, DAMA Certified Data Management Professional).
  • Familiarity with big data tools and platforms (e.g., Hadoop, Spark, Kafka).
  • Knowledge of real-time data streaming tools (e.g., Apache Kafka, Flink); a minimal consumer sketch follows this list.
  • Experience with containerization technologies (e.g., Docker, Kubernetes).
  • Understanding of machine learning pipelines and data science workflows.
  • Knowledge of enterprise architecture frameworks (e.g., TOGAF).
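
As a sketch of the real-time streaming familiarity mentioned above, the following consumes JSON messages with the kafka-python client; the broker address and the topic name "events" are assumptions for illustration.

```python
# Minimal real-time consumer sketch using the kafka-python client.
# Broker address and topic name ("events") are assumed for illustration.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    # message.value holds the deserialized JSON payload.
    print(message.topic, message.partition, message.offset, message.value)
```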

Responsibilities

  • Build and maintain scalable, reliable, and efficient data pipelines to collect, process, and store data from various sources.
  • Design and implement ETL (Extract, Transform, Load) processes to prepare data for analysis and reporting; see the pipeline sketch after this list.
  • Develop and maintain data models, schemas, and standards to support analytics and operational needs.
  • Collaborate with data scientists, analysts, and business stakeholders to understand data needs and deliver solutions.
  • Analyze large datasets to identify trends, patterns, and actionable insights.
  • Present findings and recommendations to stakeholders using dashboards, reports, and visualizations.
  • Optimize database and data pipeline performance, ensuring scalability and reliability for large datasets.
  • Monitor and troubleshoot data pipeline issues, minimizing downtime and ensuring system reliability.
  • Implement data quality checks and validation processes to ensure data integrity; see the validation sketch after this list.
  • Implement security measures to safeguard data and systems against unauthorized access and breaches.
  • Ensure compliance with data governance policies, security standards, and regulatory requirements (e.g., GDPR, HIPAA).
  • Establish best practices for data management, security, and governance to protect sensitive information.
  • Stay updated on industry trends and emerging technologies to enhance analytical and architectural capabilities.
  • Identify opportunities to improve data collection, processing, and analysis workflows.
  • Evaluate and recommend data management tools, technologies, and platforms to meet organizational goals.
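
As an illustration of the pipeline and ETL responsibilities above, here is a minimal PySpark sketch that reads raw CSV files, cleans them, and writes partitioned Parquet; the paths and column names are hypothetical.

```python
# PySpark ETL sketch: read raw CSV, clean it, write partitioned Parquet.
# All paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw landing-zone files.
raw = spark.read.csv("s3a://landing/orders/*.csv", header=True, inferSchema=True)

# Transform: drop malformed rows, normalize types, derive a partition column.
clean = (
    raw.dropna(subset=["order_id", "order_ts"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: columnar storage partitioned for downstream analytics.
clean.write.mode("overwrite").partitionBy("order_date").parquet("s3a://warehouse/orders/")
```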
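And for the data quality item, a short validation sketch in the same PySpark vein: it enforces a few basic invariants and raises on violation so an orchestrator marks the run failed. Paths and column names are again assumptions.

```python
# Data-quality check sketch: fail the run if basic invariants break.
# Paths and column names are assumptions for illustration.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_dq").getOrCreate()
df = spark.read.parquet("s3a://warehouse/orders/")

total = df.count()
null_ids = df.filter(F.col("order_id").isNull()).count()
dupes = total - df.dropDuplicates(["order_id"]).count()

# Raise (and thereby fail the orchestrated task) on integrity violations.
if total == 0:
    raise ValueError("no rows loaded")
if null_ids > 0:
    raise ValueError(f"{null_ids} rows missing order_id")
if dupes > 0:
    raise ValueError(f"{dupes} duplicate order_id values")
```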