About the position
We are seeking a passionate data engineer/analyst with at least 3 years of experience to own our team's data management processes and to develop new approaches to data consumption and quality control. The ideal candidate has strong SQL knowledge, experience with data manipulation in Python, and familiarity with common data formats. Experience with modern cloud data solutions, writing DAGs and building data pipelines in Airflow, transforming, testing, and documenting data using dbt, and working with Tableau will help a candidate stand out.
Responsibilities
- Telemetry design for new game features.
- Improving existing telemetry: enriching it with new parameters, removing unused ones, and finding bugs and inconsistencies with the core logic.
- Creating and maintaining data documentation: descriptions of telemetry and the processes built on it, data infrastructure usage guidelines, etc.
- Developing and maintaining a data quality and consistency control ecosystem: defining the validation logic, implementing it in code, and building an alerting system.
- Developing a notification system for changes in product KPIs (implemented as a Slack bot).
- Developing data marts and dashboards.
Requirements
- 3+ years of experience as a data engineer/analyst.
- Strong SQL knowledge.
- Familiarity with version control systems (Git).
- Experience with data manipulation in Python.
- Familiarity with common data formats (CSV, XML, JSON).
- Experience in data modeling.
- Understanding of the principles of database design and operation.
Nice to have
- Experience with modern cloud data solutions (Snowflake, Google BigQuery).
- Experience writing DAGs and building data pipelines in Airflow.
- Experience transforming, testing, and documenting data using dbt.
- Experience with Tableau.