About the position
As a Senior Data Engineer on the Data team at MNTN, you will play a crucial role in building and managing the platform for generating, tracking, and analyzing key business and client success metrics. Your main objective will be to extract meaningful insights from raw data using SQL and other tools, and create efficient ETL/ELT workflows to transform and organize the data for reporting and performance tracking. Additionally, you will be responsible for visualizations, reporting, and alerting to illustrate performance trends and identify opportunities. This role requires strong experience in data engineering, SQL, data modeling, and familiarity with cloud computing environments and business intelligence tools.
Responsibilities
- Become the expert on the MNTN platforms, UI, data infrastructure, and data processes
- Extract meaningful business metrics from raw data using SQL and other tools
- Create and manage ETL/ELT workflows that transform raw data into accessible information across databases and data warehouses
- Organize data and metrics so that reporting and client performance can be measured and tracked with confidence
- Organize visualizations, reporting, and alerting to illustrate performance, data quality, trends, and opportunities
- Investigate critical incidents and ensure resolution by relevant parties
Requirements
- 5+ years of experience in data engineering, analysis, and modeling of complex data
- Strong experience in SQL, data modeling, and manipulating and extracting large data sets
- Hands-on experience working with data warehouse technologies
- Familiarity with building data pipelines and architectures and designing ETL flows
- Experience with programming languages such as Python, Java, or shell scripting
- Familiarity with algorithms
- Familiarity with software processes and tools such as Git, CI/CD pipelines, Linux, and Airflow
- Experience with working in a cloud computing environment such as AWS, Azure, or GCP
- Familiarity with a business intelligence tool such as Domo or Looker
Benefits
- 100% remote work
- Open-ended vacation policy with an annual vacation allowance
- Three-day weekend every month of the year
- Competitive compensation
- 100% healthcare coverage
- 401k plan
- Flexible Spending Account (FSA) for dependent care, medical, and dental expenses
- Access to coaching, therapy, and professional development