Data Engineering Analyst - Portfolio Analytics & Reporting

StepStone Group Inc. · La Jolla, CA
Posted 37 days ago · $72,000 - $76,500 · Hybrid

About The Position

We are global private markets specialists delivering tailored investment solutions, advisory services, and impactful, data-driven insights to the world's investors. Leveraging the power of our platform and our peerless intelligence across sectors, strategies, and geographies, we help identify the advantages and the answers our clients need to succeed.

Position Overview

The Data Analyst will be a member of the Data Engineering team within StepStone's Portfolio Analytics and Reporting ("SPAR") department. This person will take primary ownership of a flagship client-facing pipeline that delivers daily data files: monitoring automated processes, validating outputs, troubleshooting issues, and implementing ongoing improvements. In addition, this person will build and maintain reliable, scalable data pipelines using Snowflake, dbt, Airflow, and Airbyte, helping to advance and optimize our modern data platform that supports downstream analytics and client reporting.

This role is ideal for someone who enjoys combining analytical thinking with hands-on data engineering, learning from senior engineers and architects while owning critical operational responsibilities that directly impact our clients and business teams.
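
To ground the stack named above, here is a minimal sketch of what such a daily client-file pipeline can look like as an Airflow DAG. It assumes Airflow 2.4+, and every name, task, and schedule below is hypothetical; the posting does not describe StepStone's actual pipelines.

    # A minimal, illustrative DAG of the kind this role would operate, assuming
    # Airflow 2.4+. The dag_id, task names, and schedule are all hypothetical.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_daily_positions(ds, **_):
        # Hypothetical: pull the day's records from the warehouse into a staging file.
        print(f"extracting records for {ds}")


    def validate_output(ds, **_):
        # Hypothetical: row-count and schema checks before anything is sent out.
        print(f"validating extract for {ds}")


    def deliver_client_file(ds, **_):
        # Hypothetical: push the validated file to the client's delivery endpoint.
        print(f"delivering file for {ds}")


    with DAG(
        dag_id="daily_client_file_delivery",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="0 6 * * *",  # daily, ahead of business hours
        catchup=False,
        default_args={
            "retries": 2,  # absorb transient failures before a human is paged
            "retry_delay": timedelta(minutes=5),
        },
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_daily_positions)
        validate = PythonOperator(task_id="validate", python_callable=validate_output)
        deliver = PythonOperator(task_id="deliver", python_callable=deliver_client_file)

        # Delivery is gated on validation so bad data never reaches a client.
        extract >> validate >> deliver

The task order (extract, then validate, then deliver) mirrors the posting's emphasis: outputs are checked before anything reaches a client.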

Requirements

  • Bachelor's degree in Computer Science, Information Systems, or a related quantitative field.
  • 1-3 years of experience in data analysis or a junior data engineering role.
  • Strong SQL skills; experience working with Snowflake is preferred.
  • Scripting ability (Python or similar) for automation and troubleshooting.
  • Familiarity with modern data tools (Snowflake, dbt, Airflow, Airbyte) or a strong willingness to learn.
  • Ownership mindset, strong sense of responsibility, and attention to detail.
  • Solid communication skills; comfortable liaising with technical and non-technical stakeholders.
  • Ability to triage and solve problems under pressure.

Nice To Haves

  • Experience operating production pipelines (monitoring, SLAs, incident management).
  • Knowledge of version control (Git), CI/CD, or infrastructure as code.
  • Exposure to cloud platforms (AWS, GCP, Azure).
  • Experience with data testing frameworks (e.g., dbt tests, validation frameworks).
  • Familiarity with finance, private markets, or investment data domains.

Responsibilities

  • Own daily client-facing pipeline operations: monitor automated Airflow jobs, validate outputs, and ensure timely, accurate file deliveries to clients.
  • Troubleshoot and resolve data or delivery issues, coordinating with internal and client stakeholders as needed.
  • Build and maintain monitoring, alerting, and validation checks to meet SLAs and data quality standards and to enhance pipeline reliability.
  • Support development and maintenance of end-to-end ELT processes using Airbyte, dbt, Airflow, and Snowflake, with final outputs modeled as star and snowflake schemas.
  • Implement data validation and reconciliation logic to ensure completeness and integrity across systems (a minimal sketch follows this list).
  • Contribute to automation and DevOps practices (CI/CD, code reviews, documentation, and runbooks).
  • Support other data initiatives such as master data management, data modeling, and ad-hoc analysis.
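
As referenced in the validation item above, reconciliation logic can start as simply as comparing row counts between a raw source and its modeled target. The sketch below assumes the snowflake-connector-python package; the account, credentials, and table names are placeholders, not StepStone's actual implementation.

    import snowflake.connector


    def reconcile_row_counts(conn, source_table: str, target_table: str) -> None:
        # Fail loudly if the modeled target drifts from its raw source.
        cur = conn.cursor()
        try:
            src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
            tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
        finally:
            cur.close()
        if src != tgt:
            # In a production pipeline this would block the delivery task and
            # trigger an alert rather than just raise.
            raise ValueError(f"{target_table} has {tgt} rows vs {src} in {source_table}")


    if __name__ == "__main__":
        # Placeholder credentials; real jobs would read these from a secrets
        # backend (e.g. the Airflow connection store), never hard-code them.
        conn = snowflake.connector.connect(
            account="my_account", user="svc_pipeline", password="...",
            warehouse="REPORTING_WH", database="ANALYTICS",
        )
        try:
            reconcile_row_counts(conn, "RAW.POSITIONS", "MARTS.FCT_POSITIONS")
        finally:
            conn.close()

A check of this kind would typically run as its own pipeline task between transformation and delivery, so a mismatch stops the file from going out.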

What This Job Offers

  • Job Type: Full-time
  • Career Level: Entry Level
  • Industry: Securities, Commodity Contracts, and Other Financial Investments and Related Activities
  • Number of Employees: 1,001-5,000
