Data Engineer (Snowflake & DBT)

Mariner Wealth Advisors · Overland Park, KS
$80,000 - $120,000

About The Position

We are seeking candidates for a full-time Data Engineer position at our Overland Park, KS headquarters; we are also open to candidates who are hybrid or remote. The engineer will join a team developing data pipelines, building curated enterprise data products, and constructing system integrations for Mariner's ecosystem of platforms. This position requires excellent communication skills to collaborate across multiple business units in support of data-driven decision-making. As part of our collaborative team, you will integrate a diverse ecosystem of CRM, portfolio accounting, trading, and ERP systems, crafting a seamless and interconnected suite of tools for Mariner. We are searching for positive, data-obsessed engineers with a passion for the industry and the ability to apply a variety of technical disciplines in support of Mariner's growing portfolio of businesses.

Requirements

  • Bachelor's degree in Computer Science, Engineering, or a related field
  • 3+ years of professional experience in software development or data engineering preferred
  • Significant experience with SQL and Python

Nice To Haves

  • Relational databases and cloud data warehouses (SQL, Snowflake, Databricks, Redshift)
  • Data mastering and pipeline development (DBT, Python/Pandas, Spark, Streams/Tasks)
  • Building back-end services and processes in Python, Java, C#, or similar
  • Common data formats and data exchanges (JSON, CSV, JDBC, FTP, S3)
  • Experience with scripting languages (PowerShell, Bash/sh, etc.)
  • Cloud environments (AWS, Azure) and tools (Lambda, ECS, S3)
  • DevOps and CI/CD practices (Docker, GitHub Actions) and IaC (Terraform)
  • Automation platforms (ActiveBatch, Airflow, Dagster, Windmill)
  • Enterprise systems such as Salesforce, ERPs, portfolio management platforms, order-management systems, and reporting tools (Tableau/Streamlit)
  • Prior experience in the financial services, investments, trading, or related industries

Responsibilities

  • Developing enterprise data platforms, data extraction, and mastering pipelines leveraging modern ETL/ELT practices
  • Systems integrations using Python, APIs, orchestration tools, and scripting languages
  • Building and maintaining enterprise datasets using Snowflake, SQL, and DBT
  • Architecture and asset deployments using IaC and CI/CD tools
  • Developing and maintaining automation configurations and scheduling
  • Building tools and processes for data validation, testing, alerting, and monitoring

Benefits

  • Progressive opportunities for professional growth
  • Supportive and diverse culture
  • Innovative workplace fostering camaraderie and teamwork
  • Work-life balance


What This Job Offers

Job Type: Full-time
Career Level: Mid Level
Industry: Credit Intermediation and Related Activities
Education Level: Bachelor's degree
