DATA SOLUTION ARCHITECT

CARE USA, Atlanta, GA

About The Position

CARE is looking for a Cloud Solution Architect to join our Digital team. This person will design, develop, maintain, and support our Enterprise Data Warehouse and BI platform within CARE, with the ability to directly affect lives by leveraging the power of data.

In this role you will work in an agile environment to analyze, design, develop, and deliver Enterprise Data Warehouse solutions using cloud technologies and big data platforms. You will create ETL pipelines using Python, Azure Data Factory, and Airflow, and build data pipelines to create and maintain a data lake on AWS, Azure, or an equivalent cloud platform. You will work with systems that handle sensitive data under regulatory compliance and change management processes, including data masking and row- and column-level data security implementations, and develop collaborative relationships with key sponsors and IT resources for the efficient resolution of work requests.

You will analyze raw data sources and data transformation requirements for structured, semi-structured, and unstructured data, and use REST APIs and microservices-based data integrations to ingest data from varied data sources and open-source platforms. You will write code, and leverage tools, to transform data and incorporate business logic defined in conjunction with various stakeholders. You will develop, enforce, and recommend enhancements to applications in the areas of standards, methodologies, compliance, and quality assurance practices, and participate in design and code walkthroughs. You will work with team members through all phases of the Systems Development Life Cycle (SDLC) using Agile practices, and develop and design solutions to support data analytics and reporting needs with data warehouse / NoSQL technologies and Power BI, including real-time data streaming and processing with open-source technologies such as Kafka and Spark. Good communication skills are needed to present technical solutions, architecture, and standards supporting the Digital team's data policies and best practices.

Requirements

  • Cloud solution architect experience
  • Experience with Enterprise Data Warehouse & BI platform
  • Experience in an agile environment
  • Experience with cloud technologies and big data platforms
  • Experience creating ETL pipelines using Python, Azure Data Factory, and Airflow
  • Experience creating data pipelines for a data lake on AWS, Azure, or an equivalent cloud platform
  • Experience working with systems that handle sensitive data under regulatory compliance and change management processes, including data masking and row- and column-level data security implementations
  • Experience developing collaborative relationships with key sponsors and IT resources
  • Experience analyzing raw data sources and data transformation requirements for structured, semi-structured, and unstructured data
  • Experience working with REST APIs and microservices-based data integrations
  • Experience writing code and leveraging tools to transform data
  • Experience developing, enforcing, and recommending enhancements to applications in the areas of standards, methodologies, compliance, and quality assurance practices
  • Experience participating in design and code walkthroughs
  • Experience working with team members through all phases of the Systems Development Life Cycle (SDLC) using Agile practices
  • Experience developing and designing solutions to support data analytics and reporting needs with data warehouse / NoSQL technologies and Power BI
  • Experience working with real-time data streaming and processing using open-source technologies such as Kafka and Spark
  • Good communication skills to present technical solutions, architecture, and standards supporting data policies and best practices

Responsibilities

  • Develop data architecture solutions to ingest, transform, load, and model data warehouses using cloud and big data technologies.
  • Develop and conduct proofs of concept using cloud technologies to rapidly build solutions supporting various data science initiatives.
  • Develop a reusable framework to ingest data from varied sources and patterns, creating pipelines in Python or Spark to transform and store data in a big data platform.
  • Analyze raw data sources and data transformation requirements and apply AWS or Azure technologies.
  • Perform data modeling and data warehouse design.
  • Lead the effort to derive insight through data mining and analysis of different datasets.
  • Apply advanced modeling techniques to create data relationship diagrams and enhance models to support future needs.
  • Conduct workshops and lead sessions with internal teams, research specialists, and operational teams to develop data architecture solutions supporting the desired functionality.
  • Identify root causes, recommend solutions based on data analysis of the problem statement, and design and implement solutions based on priorities.
  • Develop conceptual, logical, and physical models reflecting the different subject areas.


What This Job Offers

Job Type: Full-time
Career Level: Senior
Education Level: No Education Listed
Number of Employees: 101-250 employees
