Data Engineer (Remote - US)

ICF International, Inc., Reston, VA
Remote

About The Position

We are open to supporting 100% remote work anywhere within the United States; you must be able to support the Eastern Time Zone. ICF is a rapidly growing, entrepreneurial, multi-faceted consulting company seeking a Data Engineer. The Data Engineer will help bring new data insights to a government agency committed to improving child welfare. The ICF team performs custom software development, analytics, and maintenance on a suite of web-based applications, and works closely with clients and other contractors to ensure the performance and reliability of public-facing, mission-critical applications. Based on your experience and interests, we may ask you, as a technology professional, to support growth-related activities, including (but not limited to) RFI and RFP responses, prototypes, and oral presentations. Team members are also expected to uphold and maintain the certifications appropriate to their practice expertise.

Requirements

  • Bachelor's degree (e.g., Computer Science, Engineering or related discipline)
  • 6-8 years of experience in data engineering, with a strong background in pipeline development and data integration.
  • 3+ years of hands-on experience with AWS data services, including AWS Glue, Lambda, S3, Step Functions, and Athena; familiarity with Redshift and Lake Formation is a plus.
  • 6+ years of experience in SQL and programming, preferably in Python.
  • Experience with BI tools such as Tableau, Power BI, or Amazon QuickSight.
  • Experience with cloud integration tools such as Talend or Informatica.
  • Excellent oral communication, thought leadership, and formal presentation skills.
  • U.S. Citizen or Lawful Permanent Resident (Green Card holder).
  • Must be able to obtain and maintain a Public Trust clearance.
  • Must reside in the United States (U.S.), and the work must be performed in the U.S., as this work is for a federal contract and U.S. laws apply.

Nice To Haves

  • Understanding of ETL concepts: data flow, data enrichment, data consolidation, change data capture, and transformation.
  • Familiarity with modern data lake architecture and data governance best practices.
  • Demonstrated strong critical-thinking and problem-solving skills, paired with a desire to take initiative.
  • Experience working with big data processing frameworks such as Apache Spark, and streaming platforms like Kafka or AWS Kinesis.
  • AWS certification (Data Analytics, Developer, or Solutions Architect) is a plus.
  • Experience with event-driven architectures and real-time data pipelines.
  • Experience with DevOps tools such as Jenkins and Git to support the development process.
  • Experience working in agile development environments.

Responsibilities

  • Help build and optimize an AWS-based data lake to support AI/ML initiatives and advanced analytics.
  • Design and implement scalable data ingestion pipelines for both batch and real-time data from diverse structured and unstructured sources.
  • Perform extensive data profiling, transformation, and enrichment to prepare clean, ML-ready datasets for data scientists and analysts.
  • Develop custom reports and data visualizations to support analytics and decision-making across business and technical teams.
  • Collaborate with data scientists and business teams to deliver curated datasets and reporting needs for ML and analytics.
  • Support project delivery on data lake and data warehouse/BI projects for external and internal clients, including partnering with ICF subject matter experts on project execution.

What This Job Offers

Job Type: Full-time
Career Level: Mid Level
Industry: Professional, Scientific, and Technical Services
Number of Employees: 5,001-10,000 employees
