Victaulic Company • Posted about 1 month ago
Full-time • Mid Level
Hybrid • Easton, PA

We are seeking an innovative Snowflake Solutions Engineer to join our growing IT team and lead the design and implementation of advanced Snowflake-native applications and AI-powered data solutions. This role will focus on leveraging Snowflake's modern platform capabilities, including Streamlit applications, Cortex AI services, and emerging technologies, to deliver business value through cutting-edge data solutions. The ideal candidate will have deep expertise in the Snowflake ecosystem and in data architecture patterns, including data warehousing, data lakes, and open lakehouse architectures, as well as experience building user-facing data applications.

  • Snowflake Native Application Development (30%): Design and develop interactive data applications using Streamlit in Snowflake for self-service analytics and operational workflows, enabling business users to interact with data through intuitive interfaces. Create reusable application frameworks and component libraries for rapid solution delivery. Integrate Snowflake Native Apps and third-party Marketplace applications to extend platform capabilities. Develop custom UDFs and stored procedures to support advanced application logic and business rules. (A hedged Streamlit in Snowflake sketch appears at the end of this posting.)
  • Data Architecture and Modern Platform Design (30%): Design and implement modern data architecture solutions spanning data warehousing, data lakes, and lakehouse patterns. Implement and maintain medallion architecture (bronze-silver-gold) patterns for data quality and governance. Evaluate and recommend architecture patterns for diverse use cases, including structured analytics, semi-structured data processing, and AI/ML workloads. Establish best practices for data organization, storage optimization, and query performance across different data architecture patterns. (See the Snowpark medallion sketch at the end of this posting.)
  • AI Support and Advanced Analytics Collaboration (15%): Support AI and data science teams with Snowflake platform capabilities and best practices. Collaborate on implementing Snowflake Cortex AI features for business use cases. Provide technical guidance on data access patterns and feature engineering for AI workloads. Design data structures and access patterns optimized for ML model training and inference. Participate in proofs of concept for AI capabilities and provide platform expertise. (See the Cortex AI sketch at the end of this posting.)
  • Security, Governance, and Technical Leadership (15%): Design and implement role-based access control (RBAC) hierarchies following least-privilege principles. Establish security best practices, including network policies, authentication methods, data encryption, and row- and column-level security and masking. Implement object tagging strategies and tag-based policies for access control and governance. Monitor and optimize application performance, query efficiency, and user experience. Establish cost optimization strategies for compute resources and storage across different workload patterns. Provide technical guidance on Snowflake capabilities, features, and roadmap to stakeholders. Lead architectural discussions on solution design patterns and technology selection. Create technical documentation, implementation guides, and best practice recommendations. (See the RBAC and masking sketch at the end of this posting.)
  • Bachelor's degree in Computer Science, Information Systems, Data Engineering, Data Science, or a related technical field
  • At least 2 years of recent hands-on experience with Snowflake platform including advanced features
  • Minimum 3 years of experience in data engineering or solutions architecture roles
  • 7-10 years of experience in Data Architecture/Engineering and/or BI in a multi-dimensional environment
  • Proven track record of developing data applications or analytical solutions for business users
  • Snowflake Expertise: Advanced knowledge of Snowflake architecture including data warehousing, data lakes, and emerging lakehouse features
  • Security and Governance: Deep understanding of RBAC, row-level security, data masking, and Snowflake security best practices
  • DevOps and CI/CD: Strong experience with GitHub, SnowDDL, automated deployment pipelines, and infrastructure as code
  • Application Development: Proficiency with Streamlit in Snowflake for building interactive data applications
  • SQL Proficiency: Expert-level SQL skills with experience in complex analytical queries and optimization
  • Python Programming: Strong Python skills for Snowpark development, data processing, and application logic
  • Data Architecture: Deep understanding of data warehousing concepts, data lake patterns, and modern lakehouse architectures
  • Backup and Recovery: Experience with disaster recovery planning, backup automation, and data retention strategies
  • Certifications: Snowflake SnowPro Core, SnowPro Advanced: Architect, or SnowPro Advanced: Data Engineer certification
  • AI/ML Collaboration: Experience supporting data science teams and understanding ML workflow requirements
  • Development Frameworks: Experience with modern web frameworks, API development, and microservices
  • Cloud Platforms: Knowledge of AWS, Azure, or Google Cloud data services and integration patterns
  • Data Governance: Understanding of data cataloging, metadata management, and governance frameworks
  • DevOps Tools: Experience with GitHub Actions, Jenkins, GitLab CI/CD, or similar automation platforms
  • Infrastructure as Code: Proficiency with SnowDDL, Terraform, Schemachange, or other IaC tools for Snowflake
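
The sketches below illustrate, in a hedged way, the kinds of work described in the responsibility areas above; none of them is prescriptive, and every object name in them is an assumption made for the example. First, a minimal Streamlit in Snowflake page for self-service analytics. The SALES.PUBLIC.ORDERS table and its REGION, ORDER_DATE, and AMOUNT columns are hypothetical.

```python
# Hedged sketch: a minimal Streamlit in Snowflake page for self-service analytics.
# SALES.PUBLIC.ORDERS and its REGION, ORDER_DATE, AMOUNT columns are assumptions.
import streamlit as st
from snowflake.snowpark.context import get_active_session
from snowflake.snowpark.functions import col, sum as sum_

session = get_active_session()  # available when the app runs inside Snowflake

st.title("Order Volume Explorer")
region = st.selectbox("Region", ["AMER", "EMEA", "APAC"])  # hypothetical values

orders = (
    session.table("SALES.PUBLIC.ORDERS")                  # hypothetical table
    .filter(col("REGION") == region)
    .group_by("ORDER_DATE")
    .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
    .sort("ORDER_DATE")
    .to_pandas()
)

st.line_chart(orders, x="ORDER_DATE", y="TOTAL_AMOUNT")
```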
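
Next, a sketch of a bronze-to-silver Snowpark transformation in the medallion pattern referenced above. The LAKE database, its BRONZE and SILVER schemas, and all column names are assumptions.

```python
# Hedged sketch: a bronze-to-silver Snowpark transformation in a medallion layout.
# LAKE.BRONZE / LAKE.SILVER and all column names are illustrative assumptions.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, to_timestamp, trim


def bronze_to_silver(session: Session) -> None:
    raw = session.table("LAKE.BRONZE.SENSOR_READINGS")
    cleaned = (
        raw.filter(col("READING_VALUE").is_not_null())        # drop incomplete rows
        .with_column("DEVICE_ID", trim(col("DEVICE_ID")))      # normalize identifiers
        .with_column("READ_AT", to_timestamp(col("READ_AT_RAW")))
        .drop("READ_AT_RAW")
        .drop_duplicates("DEVICE_ID", "READ_AT")               # one row per device/time
    )
    cleaned.write.mode("overwrite").save_as_table("LAKE.SILVER.SENSOR_READINGS")
```

A scheduled task or an orchestration tool would typically call a function like this on a cadence appropriate to the source data.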
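
For the Cortex AI collaboration, one possible shape is scoring free-text records with the SNOWFLAKE.CORTEX.SENTIMENT function; the SUPPORT database, its tables, and the TICKET_BODY column are hypothetical.

```python
# Hedged sketch: applying the Snowflake Cortex SENTIMENT function to table data.
# SUPPORT.SILVER.TICKETS, SUPPORT.GOLD.TICKETS_SCORED, and TICKET_BODY are assumptions.
from snowflake.snowpark import Session


def score_ticket_sentiment(session: Session) -> None:
    # SENTIMENT returns a score between -1 (negative) and 1 (positive).
    session.sql(
        """
        CREATE OR REPLACE TABLE SUPPORT.GOLD.TICKETS_SCORED AS
        SELECT t.*,
               SNOWFLAKE.CORTEX.SENTIMENT(t.TICKET_BODY) AS SENTIMENT_SCORE
        FROM SUPPORT.SILVER.TICKETS AS t
        """
    ).collect()
```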
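
Finally, a sketch of least-privilege RBAC plus a column masking policy issued through Snowpark. Every role, warehouse, database, and column name here is an assumption, and the connecting role is assumed to hold the needed CREATE and GRANT privileges.

```python
# Hedged sketch: least-privilege RBAC and a column masking policy via Snowpark.
# All role, warehouse, database, schema, table, and column names are assumptions.
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<account_identifier>",   # placeholders: supply real values securely
    "user": "<user>",
    "authenticator": "externalbrowser",
    "role": "SECURITYADMIN",             # assumed to hold the needed privileges
    "warehouse": "ADMIN_WH",
}
session = Session.builder.configs(connection_parameters).create()

statements = [
    "CREATE ROLE IF NOT EXISTS ANALYST_READ",
    "GRANT USAGE ON DATABASE SALES TO ROLE ANALYST_READ",
    "GRANT USAGE ON SCHEMA SALES.PUBLIC TO ROLE ANALYST_READ",
    "GRANT SELECT ON ALL TABLES IN SCHEMA SALES.PUBLIC TO ROLE ANALYST_READ",
    # Column-level masking: only PII_ADMIN sees the raw email address.
    """
    CREATE MASKING POLICY IF NOT EXISTS SALES.PUBLIC.EMAIL_MASK AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() = 'PII_ADMIN' THEN val ELSE '***MASKED***' END
    """,
    "ALTER TABLE SALES.PUBLIC.CUSTOMERS MODIFY COLUMN EMAIL "
    "SET MASKING POLICY SALES.PUBLIC.EMAIL_MASK",
]
for stmt in statements:
    session.sql(stmt).collect()
```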