Technical Data Engineer - Level II

S & K Technologies, Inc. · Warner Robins, GA
Onsite

About The Position

This position requires the successful candidate to work on site full time; consideration is contingent on living within commuting distance of the work site.

The Technical Data Engineer - Level II plays a critical role in designing, developing, and maintaining scalable data pipelines and architectures that support the organization's data analytics and business intelligence initiatives. This position focuses on transforming raw data into actionable insights by ensuring data quality, integrity, and accessibility across platforms. The engineer collaborates closely with data scientists, analysts, and IT teams to optimize data workflows and implement best practices for data management, and is responsible for troubleshooting data-related issues and continuously improving data processing efficiency to meet evolving business needs. Ultimately, this role ensures that the organization's data infrastructure is robust, reliable, and capable of supporting strategic decision-making.

Requirements

  • Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
  • 3+ years of experience in data engineering or a similar technical role.
  • Proficiency in SQL and experience with relational and non-relational databases.
  • Hands-on experience with data pipeline and workflow management tools such as Apache Airflow or similar.
  • Strong programming skills in languages such as Python, Java, or Scala.
  • Experience working with cloud platforms like AWS, Azure, or Google Cloud Platform.
  • Familiarity with data warehousing solutions and big data technologies such as Hadoop or Spark.

Nice To Haves

  • Master’s degree in a relevant technical field.
  • Experience with containerization and orchestration tools like Docker and Kubernetes.
  • Knowledge of machine learning pipelines and integration with data engineering workflows.
  • Certification in cloud technologies (e.g., AWS Certified Data Analytics, Google Professional Data Engineer).
  • Experience with real-time data processing frameworks such as Apache Kafka or Flink.

Responsibilities

  • Design, build, and maintain efficient, reusable, and reliable data pipelines and ETL processes.
  • Collaborate with cross-functional teams to understand data requirements and deliver scalable data solutions.
  • Monitor and troubleshoot data pipeline performance and data quality issues to ensure accuracy and reliability.
  • Implement data integration from multiple sources, ensuring data consistency and integrity.
  • Optimize data storage and retrieval processes to improve system performance and reduce latency.
  • Document data engineering processes, workflows, and system architecture for knowledge sharing and compliance.
  • Stay current with emerging data technologies and recommend improvements to existing data infrastructure.