Software Engineer

Ardent Principles, Inc.
Chantilly, VA
Onsite

About The Position

We’re looking for a Senior Software Engineer who enjoys turning complex data challenges into clean, scalable solutions. In this role, you’ll design and maintain production‑grade data pipelines, build reliable API integrations, and transform messy, semi‑structured inputs into well‑organized datasets that teams can actually use. You’ll work across cloud platforms, engineer efficient SQL solutions, and apply strong Python skills to automate and optimize ETL/ELT workflows.

Who We Are

We offer advanced services in data science, data engineering, software engineering, AI solutions, cybersecurity, staff augmentation, and IT program management.

Passionate Integrity, Driven by Excellence

Ardent Principles offers a competitive salary range and a comprehensive, industry‑leading benefits package designed to support long‑term stability and employee well‑being. We provide more than a position: we offer a workplace committed to excellence, integrity, and mission‑focused impact. Our mission is to act as a bridge between satisfied clients and fulfilled employees, ensuring that your job and well-being are our top priorities, because your satisfaction leads to the success of our clients. Join us as we continue building the future of secure, high‑impact solutions.

Requirements

  • Active TS/SCI with Full Scope Polygraph

Nice To Haves

  • ServiceNow APIs, data models, and integration patterns.
  • Network management or IT operations systems data extraction.
  • Forward Networks, NetIM, SolarWinds, or similar network management platforms.
  • Experience and knowledge of ITSM (Information Technology Service Management), ITOM (Information Technology Operations Management), and CMDB (Configuration Management Database) data structures and relationships.
  • API gateway platforms and API management tools.
  • Apache Spark, particularly PySpark, for distributed data processing.
  • DBT (data build tool) for transformation workflows.
  • Infrastructure-as-code tools such as Terraform or CloudFormation.
  • Implementing CI/CD (Continuous Integration / Continuous Delivery) pipelines for data engineering code.
  • Experience and knowledge of streaming data technologies such as Kafka, Kinesis, or similar platforms.
  • Data quality platforms such as Great Expectations, Soda, or Monte Carlo.
  • Implementing data observability and monitoring solutions.
  • Experience and knowledge of Data Vault or other advanced modeling methodologies.
  • Containerization (Docker) and orchestration (Kubernetes) for data workloads.
  • Reverse ETL and operational analytics patterns.
  • Data governance platforms and metadata management tools.
  • Multiple cloud platforms and multi-cloud architectures.
  • Mentoring or leading data engineering initiatives.

Responsibilities

  • Designing, building, and maintaining production data pipelines using orchestration tools such as Apache Airflow or similar.
  • Writing and tuning complex SQL queries, including optimization and performance tuning across multiple database platforms.
  • Integrating data from client SaaS platforms and operational systems via APIs, including handling authentication, pagination, and rate limiting.
  • Working with semi-structured data (JSON and XML) from API responses and transforming it into structured datasets.
  • Developing robust API integrations with proper error handling and retry logic.
  • Working with systems that have limited documentation or vendor-specific data models.
  • Applying dimensional modeling and data warehouse design patterns.
  • Writing Python for data engineering, including work with data processing libraries.
  • Building on cloud data platforms such as AWS, Azure, or GCP, including their data services and infrastructure.
  • Implementing ETL/ELT processes from diverse data sources.
  • Applying version control (Git) and software engineering best practices.
  • Applying strong problem-solving and troubleshooting skills to complex data pipeline issues.
  • Implementing data quality checks and validation frameworks.
  • Translating business requirements into technical data solutions.
  • Delivering reliable, scalable data infrastructure with a proven track record.
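
To illustrate the kind of API integration work described above (pagination, retries, and error handling), here is a minimal, hedged Python sketch. The `fetch_page` callable, the page/`has_more` response shape, and the `stub_api` demo are all hypothetical assumptions for illustration, not any specific client API:

```python
import time
from typing import Callable, Iterator

def fetch_all_pages(
    fetch_page: Callable[[int], dict],
    max_retries: int = 3,
    backoff_seconds: float = 0.01,
) -> Iterator[object]:
    """Yield records from a paginated API, retrying transient failures.

    `fetch_page(page)` is a hypothetical callable assumed to return a
    dict of the form {"records": [...], "has_more": bool}.
    """
    page = 0
    while True:
        for attempt in range(max_retries):
            try:
                payload = fetch_page(page)
                break
            except ConnectionError:
                # Transient failure: back off exponentially, then retry.
                if attempt == max_retries - 1:
                    raise
                time.sleep(backoff_seconds * (2 ** attempt))
        yield from payload["records"]
        if not payload["has_more"]:
            return
        page += 1

# Demo with a stubbed API: the first call to page 0 fails once, then succeeds.
calls = {"n": 0}

def stub_api(page: int) -> dict:
    if page == 0 and calls["n"] == 0:
        calls["n"] += 1
        raise ConnectionError("transient")
    data = [[1, 2], [3]]
    return {"records": data[page], "has_more": page < 1}

print(list(fetch_all_pages(stub_api)))  # -> [1, 2, 3]
```

In a real integration, `fetch_page` would wrap an authenticated HTTP client and honor the vendor's rate limits; the retry/backoff loop stays the same.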

Benefits

  • Highly Competitive Salary
  • Generous Paid Time Off
  • Dedicated Training Budget
  • 100% Employer-Covered Family Vision, Dental, and Health Insurance
  • 100% Employer-Covered Life and Disability Insurance
  • 401(k) Plan with a 6% Employer Match
  • 11 Paid Government Holidays
  • Spot Bonuses for Exceptional Performance


What This Job Offers

  • Job Type: Full-time
  • Career Level: Senior
  • Education Level: None listed
  • Number of Employees: 1-10
