Data Transformation Engineer

Confie
Huntington Beach, CA
$110,000 - $120,000 | Hybrid

About The Position

Work under the guidance and supervision of the Director of Enterprise Architecture to build Confie's next-generation Enterprise Data Solutions. This role requires expertise in implementing and maintaining data solutions on the Snowflake data cloud that drive critical business insights and operations. The engineer is responsible for developing robust data models, creating efficient ELT processes, and optimizing performance to support the organization's data needs.

Requirements

  • 4+ years of professional experience in data engineering, designing and implementing data pipelines, and building data infrastructure
  • 4+ years of strong experience with the Snowflake data cloud and ETL development, including Snowflake stored procedures, UDFs in Python and SQL, Streams, Tasks, Snowpipe, and working with semi-structured data
  • 3+ years of strong experience with Python programming, including extensive use of frameworks/packages such as Snowpark, pandas, NumPy, and Requests for data analysis and integration
  • Solid understanding of data warehousing concepts, dimensional modeling, and data integration techniques
  • 2+ years of strong experience with data integration and transformation tools like Coalesce, WhereScape, and Azure Data Factory
  • Bachelor's degree in Computer Science, Engineering, or a related field

Nice To Haves

  • Experience with data quality and observability concepts is a plus
  • Snowflake Advanced Certification is a plus
  • Additional certifications related to data platforms are a plus

Responsibilities

  • Design and develop data pipelines and ELT workflows to populate the cloud Data Lake and Data Warehouse on Snowflake, using transformation tools (e.g., Coalesce, WhereScape, Azure Data Factory) and replication tools (e.g., Fivetran, Airbyte)
  • Translate business requirements into technical specifications; establish and define details, definitions, and requirements of applications, components, and enhancements
  • Design and develop robust and scalable data pipelines to support data integrations using Snowflake, Coalesce, Python, Airflow, and Fivetran
  • Design and develop Snowflake data objects (tables, views, stored procedures, UDFs, etc.)
  • Implement ELT (Extract, Load, Transform) processes using Snowflake features such as Snowpipe, Streams, and Tasks
  • Perform data cleaning, analysis, and integration using Python
  • Work with multiple data sources and types (structured, semi-structured, and unstructured)
  • Design and implement efficient data models and schemas within Snowflake to support reporting, analytics, and business intelligence needs
  • Monitor and optimize query performance and resource utilization within Snowflake using query profiling, query optimization techniques, and workload management features
  • Identify and resolve performance bottlenecks and optimize data warehouse configurations for maximum efficiency
  • Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it
  • Monitor data pipelines for timely and accurate completion
  • Stay up-to-date with industry trends and advancements in data engineering, continuously improving the team's technical knowledge and skills
  • Provide on-call support
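The data cleaning and integration work described above typically looks like the following minimal pandas sketch. This is an illustration only, not part of the posting: the column names, records, and cleaning rules (deduplication, whitespace/case normalization, median imputation) are hypothetical.

```python
import pandas as pd

# Hypothetical raw records with common quality issues:
# duplicate IDs, inconsistent casing/whitespace, missing values.
raw = pd.DataFrame({
    "policy_id": [101, 102, 102, 103],
    "state": [" ca", "CA", "CA", "tx "],
    "premium": [1200.0, None, None, 950.0],
})

cleaned = (
    raw.drop_duplicates(subset="policy_id")  # keep first record per ID
       .assign(
           # normalize state codes: strip whitespace, uppercase
           state=lambda d: d["state"].str.strip().str.upper(),
           # impute missing premiums with the column median
           premium=lambda d: d["premium"].fillna(d["premium"].median()),
       )
       .reset_index(drop=True)
)

print(cleaned)
```

In a Snowflake-centric stack, the same transformations would more often run server-side via Snowpark DataFrames, with pandas reserved for lighter ad hoc analysis.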

Benefits

  • Generous PTO plans, sick pay, and health benefits
  • Annual bonus based on employment standing
  • Work from home and hybrid model employment
  • Confie Enablement Fund / Scholarship Program
  • I-Care Recognition Program
  • Corporate Social Responsibility Program
  • Diversity, Equity and Inclusion Initiatives
  • Confie Hub and Discount Programs (Gym Membership)