About The Position

CACI is looking for a highly skilled and experienced Informatica Intelligent Cloud Services (IICS) Data Engineer with Agile methodology experience to join our BEAGLE (Border Enforcement Applications for Government Leading-Edge Information Technology) Agile Solution Factory (ASF) team, supporting our Customs and Border Protection (CBP) client in Northern Virginia. Join this passionate team of industry-leading professionals supporting best practices in Agile software development for the Department of Homeland Security (DHS).

As a member of the BEAGLE ASF team, you will support the men and women charged with safeguarding the American people and enhancing the Nation’s safety, security, and prosperity. CBP agents and officers are on the front lines every day, protecting our national security by combining customs, immigration, border security, and agricultural protection into one coordinated and supportive activity. ASF programs thrive in a culture of innovation and constantly seek individuals who can bring creative ideas to solve complex problems, both technical and procedural, at the team and portfolio levels. The ability to adapt and to work constructively with a technically diverse and geographically separated team is crucial. You should have experience with, or a strong interest in, Agile software development practices and delivering deployable software in short sprints.

Requirements

  • Must be a U.S. Citizen with the ability to pass a CBP background investigation; criteria include, but are not limited to, a 3-year check for felony convictions, a 1-year check for illegal drug use, and a 1-year check for misconduct such as theft or fraud
  • 7+ years of professional experience working on complex data challenges in the areas of data architecture and engineering
  • 3-5 years of Informatica experience, including at least 1 year of IICS implementation experience
  • Experience with designing, developing, and maintaining data integration workflows and ETL/ELT processes using Informatica Intelligent Cloud Services (IICS) or equivalent tools (e.g., Azure Data Factory, AWS Glue).
  • Experience using PySpark, Python, SQL, Kafka, and Databricks, or equivalent big data technologies such as Snowflake, Redshift, Google BigQuery, or Microsoft Azure Synapse Analytics, for advanced data transformations, large-scale data processing, and performance optimization on big data platforms (a PySpark/SQL sketch of this kind of work follows this list).
  • Experience writing complex SQL queries, stored procedures, and scripts to manage and manipulate data within the data warehouse.
  • Experience developing and implementing Continuous Integration and Continuous Deployment (CI/CD) pipelines to automate the build, test, and deployment of data-related assets and code (a sketch of the kind of automated test such a pipeline runs also follows this list).
  • Experience automating ELT data pipelines using CI/CD tools and technologies.
  • Database skills covering AWS RDS concepts and an understanding of the database principles used by tools such as Databricks, Oracle, and Informatica ETL
  • Strong software development background using Agile or DevOps methods and deep familiarity with cloud-native technologies.
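
For illustration, here is a minimal PySpark sketch of the kind of transformation and SQL window work described above. Every table, column, and value in it (border_crossings, port_code, crossing_count) is hypothetical rather than drawn from the program.

    # Minimal sketch: an ELT-style transformation in PySpark, plus the same
    # logic expressed as a SQL window query. Names and data are hypothetical.
    from pyspark.sql import SparkSession, Window, functions as F

    spark = SparkSession.builder.appName("example-transform").getOrCreate()

    # Hypothetical source: daily crossing counts landed by an ingestion job.
    events = spark.createDataFrame(
        [("PORT_A", "2024-01-01", 120), ("PORT_A", "2024-01-02", 180),
         ("PORT_B", "2024-01-01", 75)],
        ["port_code", "crossing_date", "crossing_count"],
    )

    # Standardize the date column and add a running total per port.
    w = Window.partitionBy("port_code").orderBy("crossing_date")
    transformed = (
        events
        .withColumn("crossing_date", F.to_date("crossing_date"))
        .withColumn("running_total", F.sum("crossing_count").over(w))
    )

    # The same running total written as a SQL window query.
    events.createOrReplaceTempView("border_crossings")
    via_sql = spark.sql("""
        SELECT port_code, crossing_date,
               SUM(crossing_count) OVER (
                   PARTITION BY port_code ORDER BY crossing_date
               ) AS running_total
        FROM border_crossings
    """)

    transformed.show()
    via_sql.show()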
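
And a minimal sketch of the sort of automated check a CI/CD pipeline might run against a data asset before deployment, assuming pytest and a local SparkSession; the transformation under test (add_running_total) is hypothetical.

    # Minimal sketch: a unit test a CI pipeline (e.g. Jenkins or GitLab CI)
    # could run on every commit. add_running_total is a hypothetical
    # transformation under test.
    import pytest
    from pyspark.sql import SparkSession, Window, functions as F

    def add_running_total(df):
        """Running total of crossing counts per port, ordered by date."""
        w = Window.partitionBy("port_code").orderBy("crossing_date")
        return df.withColumn("running_total", F.sum("crossing_count").over(w))

    @pytest.fixture(scope="module")
    def spark():
        session = (SparkSession.builder.master("local[1]")
                   .appName("ci-test").getOrCreate())
        yield session
        session.stop()

    def test_running_total_accumulates_per_port(spark):
        df = spark.createDataFrame(
            [("PORT_A", "2024-01-01", 10), ("PORT_A", "2024-01-02", 5)],
            ["port_code", "crossing_date", "crossing_count"],
        )
        result = {row["crossing_date"]: row["running_total"]
                  for row in add_running_total(df).collect()}
        assert result == {"2024-01-01": 10, "2024-01-02": 15}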

Nice To Haves

  • 5-10 years of DHS, DoD, or IC experience working in complex data environments, including the architecture and optimization of data schemas, terabyte-scale ETL, etc.
  • 5-10 years of experience applying a range of analytical techniques, including statistical, geospatial, link, temporal, and predictive analysis, for DHS, DoD, or IC agencies.
  • 3-5 years of experience building and implementing artificial intelligence, neural networks, deep learning, or machine learning capabilities in software applications in a national security or academic environment.
  • Exposure to Continuous Integration and Continuous Deployment (CI/CD) and DevOps processes and tools
  • Exposure to implementing or migrating to cloud environments such as Amazon Web Services (AWS) or Microsoft Azure.
  • Previous experience as an Enterprise-level Data Architect, Data Engineer, Data Scientist, or Data Analyst.
  • Ability to apply advanced principles, theories, and concepts, and contribute to the development of innovative principles and ideas.

Responsibilities

  • Modernize the data warehouse environment by migrating from Informatica PowerCenter to Informatica Intelligent Cloud Services (IICS)
  • Optimize and troubleshoot data pipelines and warehouse performance to ensure efficient and reliable data processing.
  • Design, develop, and maintain data integration workflows and ETL/ELT processes
  • Apply your skills for advanced data transformations, large-scale data processing, and optimizing performance on big data platforms.
  • Develop and implement Continuous Integration and Continuous Deployment (CI/CD) pipelines to automate the build, test, and deployment of data-related assets and code.
  • Apply your skills in development languages such as SQL, Spark, and Python to create or augment business and operational intelligence tools that detect trends, patterns, and non-obvious relationships in large, complex, and disparate data sets (a simple detection sketch follows this list).
  • Promote Agile culture and functional decomposition
  • Assist in removing roadblocks and impediments for the various teams
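
As a simple illustration of the trend and pattern detection described above, the sketch below flags days whose counts deviate sharply from a port's own history; the data, names, and two-standard-deviation threshold are all hypothetical.

    # Minimal sketch: per-group anomaly flagging with window statistics.
    from pyspark.sql import SparkSession, Window, functions as F

    spark = SparkSession.builder.appName("trend-detection").getOrCreate()

    daily = spark.createDataFrame(
        [("PORT_A", "2024-01-01", 100), ("PORT_A", "2024-01-02", 103),
         ("PORT_A", "2024-01-03", 97),  ("PORT_A", "2024-01-04", 101),
         ("PORT_A", "2024-01-05", 99),  ("PORT_A", "2024-01-06", 102),
         ("PORT_A", "2024-01-07", 98),  ("PORT_A", "2024-01-08", 400)],
        ["port_code", "crossing_date", "crossing_count"],
    )

    # Flag any day more than two standard deviations from the port's mean.
    w = Window.partitionBy("port_code")
    flagged = (
        daily
        .withColumn("mean_count", F.mean("crossing_count").over(w))
        .withColumn("std_count", F.stddev("crossing_count").over(w))
        .withColumn(
            "is_anomaly",
            F.abs(F.col("crossing_count") - F.col("mean_count"))
            > 2 * F.col("std_count"),
        )
    )
    flagged.show()  # only the 2024-01-08 spike is flagged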

Benefits

  • Healthcare
  • Wellness
  • Financial
  • Retirement
  • Family support
  • Continuing education
  • Time off

What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Number of Employees: 5,001-10,000
