About The Position

As a Cloud Data Engineer, you will guide Public Sector customers as they develop, configure, and deploy their data and AI solutions. Together with the team, you will support customer implementations of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring, and more. You will consult with customers on how to best design their data and AI solutions, including the development and deployment of ML models and integrations with leading Google technologies. You will travel to customer sites to deploy solutions and deliver workshops that educate and empower customers. Additionally, you will work closely with Product Management and Product Engineering to drive excellence in Google Cloud products and features.

Google Public Sector brings the magic of Google to the mission of government and education with solutions purpose-built for enterprises. We focus on helping United States public sector institutions accelerate their digital transformations, and we continue to make significant investments and grow our team to meet the complex needs of local, state, and federal government and educational institutions.

Requirements

  • Bachelor's degree in Computer Science or equivalent practical experience.
  • 3 years of experience with relational database technologies (e.g., PostgreSQL, MySQL, SQL Server) and writing SQL queries.
  • Experience in object-oriented programming using Python or Java, and data structures and algorithms.
  • Ability to travel up to 30% of the time as needed.
  • An active TS/SCI security clearance, or the ability to obtain one.

Nice To Haves

  • Experience building and optimizing distributed data systems and machine learning operation pipelines.
  • Experience with database management tools for backups, recovery, snapshot management, sharding, partitioning and database performance tuning.
  • Experience in database administration techniques including storage, clustering, availability, disaster recovery, security, logging, performance tuning, monitoring and auditing.
  • Experience working with business stakeholders to understand requirements, provide technical leadership, and educate teams on Google Cloud Platform (GCP) best practices.
  • Familiarity with cloud-native data warehousing solutions like BigQuery, including AI-assisted SQL and Python features.
  • Ability to troubleshoot and resolve issues in large-scale data lakes containing structured, semi-structured, and unstructured data.

Responsibilities

  • Develop and maintain robust, scalable data pipelines using Python, Java, and SQL, ensuring high performance and data integrity across multi-cloud environments.
  • Apply object-oriented design principles to create modular, self-sustaining code for automated data processing and system enhancements.
  • Work closely with cross-functional teams to translate business requirements into conceptual and logical data models.
  • Analyze on-premises and cloud database environments, consulting on the optimal design for performance and deployment on Google Cloud Platform. Design, build, and maintain data warehouse and pipeline solutions.
  • Create and deliver best practices recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders.

Benefits

  • Bonus
  • Equity
  • Benefits