Data Engineer

Guidehouse | Aberdeen, MD
Onsite

About The Position

Our consultants on the Defense & Security AI & Data team help clients maximize the value of their data and automate business processes. This high-performing team works with clients to implement the full spectrum of data analytics and data science: data querying and wrangling; data engineering; data visualization, dashboarding, and business intelligence (BI); and predictive analytics, machine learning (ML), and artificial intelligence (AI). Our services enable clients to define their information strategy, generate mission-critical insights for data-driven decision making, reduce cost and complexity, increase trust, and improve operational effectiveness.

Requirements

  • An ACTIVE and CURRENT SECRET federal security clearance.
  • Bachelor's Degree.
  • FOUR (4) or more years of total experience in a data or IT discipline.
  • TWO (2) or more years of experience in data engineering.
  • Experience designing and developing databases with Structured Query Language (SQL) and performing complex queries.
  • Experience developing complex data pipelines and extract-transform-load (ETL) processes.
  • Experience moving and manipulating data of different types and file formats using Python.
  • Experience working in cloud-based databases and architecture.
  • Demonstrated ability to perform data quality management and integration to ensure accurate and usable datasets using querying or data engineering software tools, such as SQL, Python, R, etc.
  • Experience using business intelligence tools Tableau and/or Power BI to visualize large data sets.
  • Experience developing and implementing a resiliency strategy, including data visualization tailored for law enforcement.
  • Experience developing data integration and analytics solutions that modernize law enforcement work, workforces, and workplaces.
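To illustrate the kind of work the requirements above describe, here is a minimal ETL sketch in Python: it extracts rows from a CSV source, transforms and validates them, and loads them into a SQL table. The filenames, schema, and data are hypothetical, chosen only to show the extract-transform-load pattern end to end.

```python
# Minimal ETL sketch (illustrative only): CSV source -> validation -> SQLite.
import csv
import io
import sqlite3

# Extract: a real pipeline would read from a file, database, or API;
# a small in-memory CSV stands in for the source here.
raw = io.StringIO("id,amount\n1,10.5\n2,3.25\n3,bad\n")
rows = list(csv.DictReader(raw))

# Transform: coerce types and drop rows that fail validation.
clean = []
for row in rows:
    try:
        clean.append((int(row["id"]), float(row["amount"])))
    except ValueError:
        continue  # data quality: skip malformed records

# Load: write validated rows into a SQL table with parameterized inserts.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?, ?)", clean)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
```

In practice the same shape scales up: the extract step becomes a connector to cloud storage or an API, the transform step becomes pipeline code in a tool such as Databricks, and the load step targets a warehouse rather than SQLite.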

Nice To Haves

  • Master’s degree in computer science or related field.
  • Strong understanding of different types of data storage.
  • Experience working with system and application logs.
  • Proficiency in querying and joining acquisition, programming, and transactional datasets.
  • Proficiency in programming languages such as Python.
  • Proficiency developing dashboards using Tableau or Power BI.
  • Familiarity with DevOps and container technologies (Docker/Kubernetes).
  • Ability to implement basic automation and CI/CD.
  • Experience with microservices architecture and API gateways.
  • Technical expertise with data models, data mining, data cleansing, and segmentation techniques.
  • Experience with source code version control technologies such as Git.
  • Ability to code and debug stored procedures.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration skills, with the ability to work effectively in a team environment.

Responsibilities

  • Conduct ETL/ELT and data quality analysis using various technologies (e.g., Python, Databricks, Palantir Foundry).
  • Manage ETL/ELT functions and develop, build, test, and maintain scalable data pipeline architectures and tools.
  • Develop integrated data pipelines and models to measure program performance and report on progress.
  • Work in cloud-based databases and ETL/ELT architecture.
  • Design and build distributed systems for scalability and security.
  • Work with data integration and management tools and databases.
  • Gather requirements, design, implement, and test database systems.
  • Coordinate with non-technical users to gather requirements.
  • Provide and prepare data to enable data science and machine learning.
  • Demonstrate strong understanding of relational databases, columnar data warehouses, data lakes, NoSQL, and other storage types.
  • Develop intuitive, attractive, and interactive data visualizations using large data sets to create dashboards, using tools such as Tableau and Power BI, for a diverse set of users with varying technical capabilities.
  • Request, document, report, and analyze data, and where applicable, create tools and resources to support data and reporting.
  • Develop and maintain data governance documentation, tools, and templates.
  • Work with Palantir Foundry and Databricks.
  • Maintain good working relationships with clients to enhance customer satisfaction and work with client management and staff to perform engagement services.
  • Ensure data governance and quality assurance standards are met.
  • Organize and lead client meetings, including scheduling meetings; drafting and delivering agendas and meeting minutes; providing and archiving required documentation; and documenting, tracking, and following up on action items.
  • Summarize and present information and reports to the team and make recommendations (both oral and written).
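The data quality analysis mentioned in the responsibilities above typically means running rule-based checks over incoming records. A small sketch, with hypothetical records and rules (completeness, range, and key uniqueness chosen as common examples, not a prescribed standard):

```python
# Illustrative data-quality checks: completeness, value range, key uniqueness.
records = [
    {"id": 1, "score": 88},
    {"id": 2, "score": None},   # completeness failure
    {"id": 2, "score": 150},    # duplicate id and out-of-range score
]

issues = []
seen_ids = set()
for i, rec in enumerate(records):
    if rec["score"] is None:
        issues.append((i, "missing score"))
    elif not 0 <= rec["score"] <= 100:
        issues.append((i, "score out of range"))
    if rec["id"] in seen_ids:
        issues.append((i, "duplicate id"))
    seen_ids.add(rec["id"])
```

Collected issues like these feed the reporting and governance documentation the role calls for, so failures are tracked rather than silently dropped.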

Benefits

  • Medical, Rx, Dental & Vision Insurance
  • Personal and Family Sick Time & Company Paid Holidays
  • Position may be eligible for a discretionary variable incentive bonus
  • Parental Leave and Adoption Assistance
  • 401(k) Retirement Plan
  • Basic Life & Supplemental Life
  • Health Savings Account, Dental/Vision & Dependent Care Flexible Spending Accounts
  • Short-Term & Long-Term Disability
  • Student Loan PayDown
  • Tuition Reimbursement, Personal Development & Learning Opportunities
  • Skills Development & Certifications
  • Employee Referral Program
  • Corporate Sponsored Events & Community Outreach
  • Emergency Back-Up Childcare Program
  • Mobility Stipend