Senior Data Engineer – Analytics & BI Platforms

IntelliTech LLC, Alexandria, VA
Remote

About The Position

IntelliTech is seeking a Senior Data Engineer to support mission-critical programs focused on data integration, analytics enablement, and operational decision support in secure federal environments. In this role, you will work closely with engineers, analysts, and mission stakeholders to develop scalable data pipelines, integrate enterprise data sources, and enable reporting and visualization workflows across modern analytics and BI platforms. The ideal candidate brings strong hands-on experience with Python, SQL, data pipeline development, and enterprise analytics platforms such as Databricks, Palantir, Tableau, Power BI, Qlik, or similar tools. This role is designed for an engineer who can support the full lifecycle of data solutions, from ingestion and transformation through validation, analytics enablement, visualization support, and sustainment.

Requirements

  • Active DoD Secret clearance.
  • Bachelor’s degree in Computer Science, Data Science, Engineering, Information Systems, or a related technical discipline and 4+ years of relevant experience; or Master’s degree in a related field and 2+ years of relevant experience.
  • Strong experience building or supporting data pipelines, ETL/ELT workflows, and data integration solutions in enterprise environments.
  • Hands-on experience with Python and SQL.
  • Experience with one or more modern analytics, BI, or data platform tools such as Databricks, Palantir, Tableau, Power BI, Qlik, or similar technologies.
  • Experience supporting reporting, analytics, dashboarding, or data visualization use cases.
  • Familiarity with cloud-based data environments and enterprise data workflows.
  • Familiarity with Git-based version control and collaborative development workflows.
  • Strong problem-solving and troubleshooting skills.
  • Excellent communication and collaboration skills, with the ability to work effectively across cross-functional teams.

Nice To Haves

  • Experience supporting mission-oriented analytics, logistics, operational, financial, or other large-scale federal data environments.
  • Experience with Spark, PySpark, Databricks notebooks, semantic models, BI dashboards, or enterprise reporting pipelines.
  • Experience with Palantir platform components, including Foundry, ontology concepts, or data workflows.
  • Familiarity with DevSecOps best practices, CI/CD workflows, and modern software delivery methodologies.
  • Experience working in Agile environments supporting enterprise software or analytics platforms.
  • Exposure to data governance, metadata management, lineage, or enterprise data architecture concepts.
  • Experience working with APIs, S3/object storage, or cloud-native data services.
  • Top Secret clearance or eligibility is a plus.

Responsibilities

  • Design, develop, and maintain scalable data pipelines to ingest, transform, validate, and operationalize data from a variety of enterprise sources.
  • Build and support data integration workflows across structured and unstructured data sources, including APIs, files, cloud storage, databases, and enterprise platforms.
  • Enable analytics and reporting use cases by preparing curated datasets and supporting integration with BI and analytics tools such as Tableau, Databricks, Palantir, Power BI, Qlik, or similar platforms.
  • Support the development of data models, transformation logic, and reusable engineering patterns that improve data access, decision support, and operational visibility.
  • Collaborate with analysts, engineers, and stakeholders to refine requirements and deliver data products aligned to mission needs.
  • Support testing, debugging, validation, and performance tuning of data pipelines and analytics workflows.
  • Leverage Git-based repositories and version control best practices to manage engineering efforts.
  • Participate in Agile development activities, including sprint planning, backlog refinement, demos, and retrospectives.
  • Document technical designs, data flows, transformation logic, and engineering processes to support maintainability and knowledge transfer.
  • Contribute to secure, scalable, and repeatable delivery practices across cloud and DevSecOps environments.

Benefits

  • Health, dental, and vision insurance
  • 401(k) retirement plan
  • Paid time off
  • Professional development opportunities
  • Flexible work arrangements