Data Solutions Architect

Appex Innovation
Jefferson City, MO
Onsite

About The Position

The ideal candidate will have a deep understanding of Microsoft data services, including Azure Fabric, Azure Data Factory (ADF), Azure Synapse, and ETL/ELT processes. This role focuses on designing, developing, and maintaining cloud-based data pipelines and solutions to drive our analytics and business intelligence capabilities.

Requirements

  • Previous experience in the banking sector is required.
  • Proficiency with Azure Fabric, Azure Data Factory (ADF), and Azure Synapse.
  • 4+ years of experience in designing and implementing data warehouse and analytics solutions (on-premise and cloud).
  • 3+ years of expertise in data warehousing concepts (ETL/ELT, data quality management, privacy/security, MDM), with hands-on experience using Azure Data Factory (ADF), SSIS, and related tools.
  • 3+ years of experience with cloud-native data lakes/warehouses and Microsoft Azure services (Fabric Lakehouse, Azure Data Factory, Synapse, etc.).
  • 2+ years of experience with Python, Scala, or Java for distributed processing and analytics frameworks such as Spark.
  • Familiarity with CI/CD practices and tools such as Azure DevOps, Git, or Jenkins.

Responsibilities

  • Provide technical leadership in modernizing legacy data ingestion, ETL/ELT, and databases to cloud technologies (AWS/Azure).
  • Demonstrate a self-driven, ownership mindset to navigate ambiguity, resolve constraints, and mitigate risks with minimal supervision.
  • Implement data access, classification, and security patterns that comply with regulatory standards (PII, locational data, contractual obligations, etc.).
  • Build strong relationships with technical teams through effective communication, presentation, and collaboration skills.
  • Collaborate with stakeholders, business analysts, and SMEs to translate business requirements into scalable solutions.
  • Integrate data from multiple sources into cloud-based architectures, collaborating with cross-functional teams.
  • Work closely with data scientists, analysts, and stakeholders to meet data requirements with high-quality solutions.
  • Function within a matrixed team environment, sharing responsibilities across various teams.
  • Perform data profiling and analysis on both structured and unstructured data.
  • Design and map ETL/ELT pipelines for new or modified data streams, ensuring integration into on-prem or cloud-based data storage.
  • Automate, validate, and maintain ETL/ELT processes using technologies such as Databricks, ADF, SSIS, Spark, Python, and Scala.
  • Proactively identify design, scope, or development issues and provide recommendations for improvement.
  • Conduct unit, system, and integration testing for ETL/ELT solutions, ensuring defects are resolved.
  • Create detailed documentation for data processes, architectures, and workflows.
  • Monitor and optimize the performance of data pipelines and databases.