Data Engineer

Ardent Principles, Inc.
Onsite

About The Position

We are seeking a Data Engineer who thrives in a mission-driven environment. This role is ideal for someone who can transform complex, disparate data into reliable, scalable pipelines that power analytics, automation, and decision-making across high-impact programs. If you excel at building robust data architectures, ensuring data integrity, and collaborating with cross-functional technical teams, you'll find this work both challenging and deeply meaningful.

"Ardent Principles" signifies our unwavering commitment to excellence, driven by a profound passion and a strict adherence to ethical values. We believe that happy employees make for happy clients. Our mission is to act as a bridge between satisfied clients and fulfilled employees, ensuring that your job and well-being are our top priorities, because your satisfaction leads to the success of our clients. With a competitive salary and industry-leading benefits, Ardent Principles offers more than just a job: we offer a career path filled with growth and opportunities. Join us and let's shape the future together!

Requirements

  • Active TS/SCI with Full Scope Polygraph

Nice To Haves

  • Using cloud services such as AWS, as well as cloud data technologies and architectures
  • Using big data processing tools such as Apache Spark or Trino
  • Working with machine learning algorithms
  • Using container frameworks such as Docker or Kubernetes
  • Using data visualization tools such as Tableau, Kibana, or Apache Superset
  • Creating learning objectives and teaching curricula in technical or scientific fields

Responsibilities

  • Data engineering, including designing and building data infrastructure, developing data pipelines, transforming and preparing data, ensuring data quality and security, and monitoring and optimizing systems
  • Data management and integration, including designing and operating robust data layers for application development across local, cloud, and web data sources
  • Programming with Python
  • Building scalable ETL and ELT workflows for reporting and analytics
  • Using general Linux computing and advanced Bash scripting
  • Using SQL
  • Constructing complex multi-source queries with database technologies such as PostgreSQL, MySQL, Neo4j, or RDS
  • Processing data sources containing structured or unstructured data
  • Developing data pipelines with NiFi to bring data into a central environment
  • Delivering results to stakeholders through written documentation and oral briefings
  • Using code repositories such as Git
  • Using Elastic and Kibana technologies
  • Working with multiple stakeholders
  • Documenting artifacts such as code, Python packages, and methodologies
  • Using Jupyter Notebooks
  • Applying machine learning techniques, including natural language processing
  • Explaining complex technical issues to more junior data scientists in graphical, verbal, or written formats
  • Developing tested, reusable, and reproducible work
  • Work or educational background in one or more of the following areas: mathematics, statistics, hard sciences (e.g., physics, computational biology, astronomy, neuroscience), computer science, data science, or business analytics

Benefits

  • Highly Competitive Salary
  • Generous Paid Time Off
  • Dedicated Training Budget
  • 100% Employer-Covered Family Vision, Dental, and Health Insurance
  • 100% Employer-Covered Life and Disability Insurance
  • 401(k) Plan with a 6% Employer Match
  • 11 Paid Government Holidays
  • Spot Bonuses for Exceptional Performance