Python Snowflake Data Engineer - ONSITE

NTT DATA, Buffalo, NY
Posted 1 day ago · Onsite

About The Position

The Technical Data Engineer will design, develop, and maintain enterprise-grade data solutions that support regulatory and risk reporting requirements. This includes analyzing centralized enterprise data in Snowflake, building Python-based microservices and Flask APIs for data movement, implementing business and computational rules, and constructing data models in PostgreSQL to support Enterprise Risk reporting and FED submissions via Power BI (PBI).

Requirements

  • 5+ years experience in technical leadership roles.
  • 5+ years experience analyzing and profiling Snowflake data, ensuring data quality, lineage understanding, and accurate translation of business/regulatory requirements into technical specifications.
  • 5+ years experience designing and engineering end‑to‑end ETL/ELT pipelines across Snowflake, ARK, and PostgreSQL, implementing complex transformation logic, business rules, and FED‑reporting computations.
  • 7+ years experience building and maintaining Python microservices and Flask‑based REST APIs for data orchestration, ingestion, and integration with downstream systems.
  • Experience with implementing authentication, authorization, error handling, and logging within APIs and services.
  • 7+ years experience developing optimized PostgreSQL schemas, stored procedures, and performance‑tuned queries to support analytics and regulatory reporting.
  • 5+ years experience engineering Power BI reporting layers, semantic models, DAX measures, and high‑performance datasets for enterprise risk dashboards and regulatory reporting.
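The "business rules and FED-reporting computations" requirement can be pictured with a minimal sketch. This is illustrative only: the exposure categories, risk weights, and field names below are invented for the example and are not actual FED rules.

```python
from dataclasses import dataclass

# Hypothetical risk weights per exposure category; real regulatory
# rules are far more detailed -- these values are purely illustrative.
RISK_WEIGHTS = {"sovereign": 0.0, "bank": 0.20, "corporate": 1.00, "retail": 0.75}

@dataclass
class Exposure:
    obligor: str
    category: str
    amount: float  # exposure amount in USD

def risk_weighted_assets(exposures):
    """Apply category-based risk weights and aggregate totals per category."""
    totals = {}
    for exp in exposures:
        weight = RISK_WEIGHTS.get(exp.category)
        if weight is None:
            raise ValueError(f"Unknown exposure category: {exp.category!r}")
        totals[exp.category] = totals.get(exp.category, 0.0) + exp.amount * weight
    return totals

exposures = [
    Exposure("Acme Corp", "corporate", 1_000_000),
    Exposure("First Bank", "bank", 500_000),
    Exposure("US Treasury", "sovereign", 2_000_000),
]
rwa = risk_weighted_assets(exposures)
# corporate: 1,000,000 * 1.00; bank: 500,000 * 0.20; sovereign: weighted to 0
```

In practice such rule logic would run inside the ETL/ELT pipeline (or as SQL in Snowflake/PostgreSQL), with the rule tables sourced from the regulatory specification rather than hard-coded.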

Nice To Haves

  • Ability to ensure data accuracy, perform validation and reconciliation, and resolve complex reporting and refresh pipeline issues.
  • Experience leveraging AI tools (Copilot, Claude, Foundry) to accelerate development, automate code generation, enhance documentation, and support AI‑assisted analysis and testing.
  • Experience integrating CI/CD pipelines for automated build, test, and deployment.
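The validation-and-reconciliation skill above can be sketched as a small check that compares a pipeline's source and target datasets. This is a simplified illustration; the field names (`id`, `balance`) and tolerance are assumptions, not part of the posting.

```python
def reconcile(source_rows, target_rows, key="id", measure="balance", tol=0.01):
    """Compare row counts, key coverage, and summed measures between a
    source and target dataset, returning a discrepancy summary."""
    src_total = sum(r[measure] for r in source_rows)
    tgt_total = sum(r[measure] for r in target_rows)
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),
        "unexpected_in_target": sorted(tgt_keys - src_keys),
        "totals_match": abs(src_total - tgt_total) <= tol,
        "difference": round(tgt_total - src_total, 2),
    }

source = [{"id": 1, "balance": 100.0}, {"id": 2, "balance": 250.5}]
target = [{"id": 1, "balance": 100.0}]
report = reconcile(source, target)
# Row 2 never landed in the target, so totals disagree by -250.50
```

Real reconciliation would typically run as SQL against Snowflake and PostgreSQL directly, but the shape of the check (counts, key coverage, measure totals within tolerance) is the same.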

Responsibilities

  • Analyze and profile enterprise data stored in Snowflake, understanding structures, lineage, and relationships.
  • Perform data validations, quality checks, and metadata reviews.
  • Translate business and regulatory reporting requirements into technical data specifications.
  • Apply strong SQL and data analysis skills when profiling, validating, and modeling enterprise datasets.
  • Build and maintain Python-based microservices to orchestrate and automate ELT/ETL workflows.
  • Develop Flask-based REST APIs to extract, transform, and deliver data between Snowflake, ARK databases, and downstream systems.
  • Implement complex business rules, transformation logic, data calculation formulas, and FED-reporting computations.
  • Optimize pipelines for scalability, resilience, and performance.
  • Design, implement, and deploy Python microservices to support data ingestion and enrichment.
  • Build secure Flask APIs to expose and consume data services for ARK and reporting systems.
  • Implement authentication, authorization, error handling, and logging within APIs and services.
  • Integrate CI/CD pipelines for automated build, test, and deployment.
  • Design and build PostgreSQL data models optimized for analytics and regulatory reporting.
  • Create schemas, tables, stored procedures, indexes, and reporting-optimized structures.
  • Support dashboards and reporting modules used for Enterprise Risk reporting to the FED.
  • Develop computation logic for enterprise risk metrics, aggregation layers, time-series calculations, and regulatory formulas.
  • Understand the business requirements and translate them into a reporting data model suitable for Power BI.
  • Write optimized PostgreSQL queries, views, or stored logic to build a complex dataset for efficient report consumption.
  • Build a clean semantic model in Power BI with proper relationships, hierarchies, and DAX measures.
  • Ensure data quality, validate business rules, and manage complex joins in the dataset.
  • Optimize report performance through query tuning, model simplification, and efficient PBI design patterns.
  • Coordinate data refresh scheduling, troubleshoot errors, and ensure reports run reliably end to end.
  • Leverage AI tools (Claude, Microsoft Foundry, Copilot, etc.) to accelerate solution design, documentation, and code generation.
  • Use AI-assisted data analysis to explore datasets, identify patterns, and derive insights that support reporting needs.
  • Develop and refine prompts to ensure accurate outputs, and validate AI-generated content for quality, reliability, and compliance.
  • Integrate AI-assisted workflows into development processes (e.g., code reviews, testing, optimization, debugging).
  • Collaborate closely with architects, business analysts, QA teams, and risk domain SMEs.
  • Collaborate with Risk and Compliance teams to validate calculations and adhere to regulatory frameworks.
  • Lead technical discussions around pipeline design, system integration, API frameworks, and data modeling.
  • Support code reviews, peer collaboration, and best‑practice adoption across teams.
  • Apply data governance, metadata management, and enterprise data quality frameworks.
  • Communicate and document work clearly, supported by strong analytical skills.
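The PostgreSQL modeling and aggregation-layer responsibilities above can be sketched as a fact table plus a grouped reporting query, the kind of structure a Power BI dataset would consume. The schema and data here are invented for illustration, and SQLite is used as a portable stand-in; in practice this would be PostgreSQL DDL with stored procedures and indexes tuned for the reporting workload.

```python
import sqlite3

# In-memory database standing in for the PostgreSQL reporting layer.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Fact table for risk metrics, one row per measurement.
    CREATE TABLE risk_fact (
        as_of_date    TEXT NOT NULL,
        business_unit TEXT NOT NULL,
        metric        TEXT NOT NULL,
        value         REAL NOT NULL
    );
    -- Index supporting date- and unit-filtered reporting queries.
    CREATE INDEX idx_risk_fact_date ON risk_fact (as_of_date, business_unit);
""")
conn.executemany(
    "INSERT INTO risk_fact VALUES (?, ?, ?, ?)",
    [
        ("2024-06-30", "Lending", "exposure", 120.0),
        ("2024-06-30", "Lending", "exposure", 80.0),
        ("2024-06-30", "Markets", "exposure", 50.0),
    ],
)
# Aggregation layer: one row per business unit per reporting date.
rows = conn.execute("""
    SELECT as_of_date, business_unit, SUM(value) AS total_exposure
    FROM risk_fact
    WHERE metric = 'exposure'
    GROUP BY as_of_date, business_unit
    ORDER BY business_unit
""").fetchall()
```

Pre-aggregated structures like this keep the heavy joins and sums in the database, so the Power BI semantic model stays simple and refreshes stay fast.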


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 5,001-10,000
