Data Analyst

Confiz
Onsite

About The Position

We are hiring a Data Analyst with 5+ years of experience using Python and SQL to transform complex data into actionable business insights. The role combines leading business and technical teams through data source discovery, designing ETL/ELT pipelines with Apache Airflow, DBT, Snowflake, and Databricks, and collaborating with stakeholders to define KPIs and communicate findings through dashboards built in Tableau, Power BI, or R-Shiny.

Requirements

  • Bachelor's degree in a quantitative field (Statistics, Computer Science, Data Science, or related)
  • 5+ years of experience in data analysis
  • Proficiency in Python and SQL
  • Experience developing data flows, workflow orchestration, and ELT processes using Apache Airflow, DBT, Databricks, or Snowflake
  • Experience with analytical and relational databases (BigQuery, Redshift, Oracle, Teradata, Snowflake)
  • Proficiency with data visualization tools (Tableau, Power BI, or R-Shiny)
  • Experience with Git version control
  • Knowledge of statistical analysis and hypothesis testing

Responsibilities

  • Lead business and technical teams through the process of data source discovery, collection, and definition
  • Identify required data sources, define and catalog them, and ensure accuracy by validating data aggregations, measures, and calculations
  • Design and develop data pipelines and ETL/ELT processes using Apache Airflow, DBT, Databricks, or Snowflake
  • Extract and analyze large datasets from relational databases (BigQuery, Redshift, Oracle, Teradata) and Snowflake using SQL
  • Perform data analysis using Python and SQL with advanced statistical methods
  • Create data visualizations using Tableau, Power BI, or R-Shiny
  • Conduct hypothesis testing and A/B testing for business insights
  • Collaborate with data engineers and business stakeholders to define data requirements, KPIs, and reporting standards
  • Maintain data quality frameworks by monitoring pipelines, identifying anomalies, and implementing data validation checks
  • Document data definitions, business rules, and analytical methodologies to support team knowledge sharing and governance
  • Use Git for version control and collaborative development
  • Optimize database queries and create views across multiple tables
  • Report analytical findings to business stakeholders