BigQuery Developer / GCP Data Engineer

Tryton TC LLC · Woonsocket, RI
Remote

About The Position

We are looking for a motivated and detail-oriented BigQuery Developer with hands-on experience in Google Cloud Platform (GCP) to support and enhance our enterprise data warehouse and analytics solutions. The ideal candidate will have strong SQL, BigQuery, and Python development experience, along with working knowledge of Dataflow, Cloud Composer, GitHub, and CI/CD practices. This role requires strong analytical skills, problem-solving ability, and effective communication when working with cross-functional teams.

Requirements

  • 3–5 years of experience in data engineering or data development.
  • Strong hands-on experience with BigQuery.
  • Strong SQL skills (joins, aggregations, window functions, performance tuning).
  • Experience with Google Cloud Platform (GCP).
  • Experience building batch pipelines using Python and Dataflow.
  • Experience with Cloud Composer (Airflow).
  • Working knowledge of GitHub and GitHub Actions.
  • Experience with enterprise job schedulers such as Tidal.
  • Understanding of data warehousing concepts.
  • Strong analytical and problem-solving skills.
  • Good verbal and written communication skills.

Responsibilities

BigQuery Development
  • Develop, maintain, and optimize BigQuery datasets, tables, views, procedures, and queries.
  • Write efficient and scalable SQL for reporting and analytics.
  • Implement partitioning and clustering to improve query performance (a brief sketch follows this group of tasks).
  • Support data warehouse design and data modeling activities.
  • Monitor query performance and optimize cost usage.
  • Troubleshoot and resolve data-related issues in BigQuery.
  • Support data validation and quality checks.
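For illustration only, a minimal sketch of the kind of partitioning and clustering work described above, using the google-cloud-bigquery Python client; the dataset, table, and column names are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Create a date-partitioned, clustered table so queries that filter on
    # event_date and customer_id scan (and bill for) less data.
    ddl = """
    CREATE TABLE IF NOT EXISTS analytics.events_daily
    PARTITION BY event_date
    CLUSTER BY customer_id
    AS
    SELECT DATE(event_ts) AS event_date, customer_id, event_type, amount
    FROM analytics.raw_events
    """
    client.query(ddl).result()  # wait for the DDL job to finish

    # Downstream queries filter on the partition column so BigQuery can prune partitions.
    sql = """
    SELECT customer_id, SUM(amount) AS total_amount
    FROM analytics.events_daily
    WHERE event_date BETWEEN @start AND @end
    GROUP BY customer_id
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01"),
            bigquery.ScalarQueryParameter("end", "DATE", "2024-01-31"),
        ]
    )
    for row in client.query(sql, job_config=job_config).result():
        print(row.customer_id, row.total_amount)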

Data Pipeline Development
  • Develop and maintain batch pipelines using Python and Google Cloud Dataflow (Apache Beam); a minimal pipeline sketch follows this group of tasks.
  • Load, transform, and integrate data from various sources into BigQuery.
  • Work on ETL/ELT processes and ensure reliable data processing.
  • Assist in debugging and performance tuning of pipelines.
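For illustration only, a minimal batch-pipeline sketch with the Apache Beam Python SDK as it would run on Dataflow; the project, bucket, table name, and file layout are hypothetical:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_order(line):
        # Turn one CSV line ("order_id,customer_id,amount") into a BigQuery row dict.
        order_id, customer_id, amount = line.split(",")
        return {"order_id": order_id, "customer_id": customer_id, "amount": float(amount)}


    def run():
        options = PipelineOptions(
            runner="DataflowRunner",            # use "DirectRunner" for local testing
            project="my-project",
            region="us-east1",
            temp_location="gs://my-bucket/tmp",
        )
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "Read" >> beam.io.ReadFromText("gs://my-bucket/orders/*.csv", skip_header_lines=1)
                | "Parse" >> beam.Map(parse_order)
                | "Write" >> beam.io.WriteToBigQuery(
                    "my-project:analytics.orders",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                )
            )


    if __name__ == "__main__":
        run()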

Workflow Orchestration
  • Develop and maintain workflows using Cloud Composer (Apache Airflow); a sample DAG sketch follows this group of tasks.
  • Integrate workflows with Tidal Job Scheduler for enterprise scheduling.
  • Monitor production jobs and support issue resolution.
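For illustration only, a sketch of a small Cloud Composer (Airflow) DAG that runs a BigQuery job on a daily schedule; the DAG id, schedule, and stored procedure it calls are hypothetical, and any Tidal-driven enterprise scheduling would sit outside the DAG itself:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

    with DAG(
        dag_id="daily_sales_refresh",
        start_date=datetime(2024, 1, 1),
        schedule_interval="0 6 * * *",   # daily at 06:00 UTC
        catchup=False,
    ) as dag:
        # Run a stored procedure that rebuilds a reporting summary table.
        refresh_summary = BigQueryInsertJobOperator(
            task_id="refresh_sales_summary",
            configuration={
                "query": {
                    "query": "CALL analytics.sp_refresh_sales_summary()",
                    "useLegacySql": False,
                }
            },
        )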

Version Control & CI/CD
  • Use GitHub for source control and collaboration.
  • Contribute to CI/CD pipelines using GitHub Actions.
  • Follow best practices for code versioning and peer reviews.

Collaboration & Communication
  • Work closely with data analysts, business users, and technical teams.
  • Translate business requirements into efficient BigQuery solutions.
  • Document data flows, technical designs, and operational processes.
  • Provide production support as needed.