About The Position

At GFiber, we believe that great internet has the power to drive innovation, strengthen communities, enable the impossible, and do all the everyday things that make the world go round. And the job of creating better internet is never done - so we're growing! Our team is committed to building a place where people who want to make a difference can grow their careers and find their spot to belong.

GFiber is an Alphabet company that brings Google Fiber and Google Fiber Webpass internet services to homes and businesses across the United States. Our teams are expanding as we connect more cities and people to exceptional internet. The application window will be open until at least March 27th, 2026. This opportunity will remain online based on business needs, which may mean it closes before or after that date. This role is not eligible for immigration sponsorship.

GFiber's mission is to deliver abundant internet on networks that are always fast and always open, with products that are easy to understand and clearly priced. We believe customers deserve a better internet experience, and everything we do is focused on providing just that. On our team, you'll work in an environment that's redefining the status quo in the internet industry. Within the Data Community team, you'll do everything from data engineering and scripting to visuals, insights, data science, and more.

Role Description

As a Finance Analytics Engineer for GFiber, you will work with multiple stakeholders across the organization to provide quantitative support, define metric rules and logic, deliver useful tables and reports, model business scenarios, and solve a wide range of business problems. To do so, you will be expected to create new pipelines from scratch, merging broad financial and ERP data sets into a cohesive technical infrastructure. You will be responsible for engineering foundational table resources using GCP tools such as BigQuery and LookML.
The pipelines, metrics, tools, and analyses you develop will be used by our leaders to make strategic decisions and serve as a key input in decision-making.
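
For candidates wondering what "merging financial and ERP data sets" looks like in miniature, here is a purely illustrative sketch in Python (the actual stack centers on SQL in BigQuery; the vendor and invoice fields below are invented for this example, not taken from GFiber's systems):

```python
# Hypothetical example: join invoice rows from a billing extract to their
# ERP vendor record, producing one reporting-ready table. All field names
# and data are illustrative assumptions.

def merge_erp_and_invoices(vendors, invoices):
    """Join each invoice to its ERP vendor record by vendor_id."""
    by_id = {v["vendor_id"]: v for v in vendors}
    merged = []
    for inv in invoices:
        vendor = by_id.get(inv["vendor_id"], {})
        merged.append({
            "invoice_id": inv["invoice_id"],
            "vendor_name": vendor.get("name", "UNKNOWN"),
            "amount_usd": inv["amount_usd"],
        })
    return merged

vendors = [{"vendor_id": 1, "name": "Acme Networks"}]
invoices = [{"invoice_id": "INV-100", "vendor_id": 1, "amount_usd": 2500.0}]
print(merge_erp_and_invoices(vendors, invoices))
```

In practice this kind of join would be expressed as SQL over BigQuery tables rather than in-memory Python, but the shape of the work - keying disparate financial sources on a shared identifier and emitting a clean, consumable table - is the same.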

Requirements

  • Bachelor's degree or equivalent practical experience.
  • 5 years of experience in analytics, data science, and/or computer science engineering.
  • 5 years of experience using SQL (writing queries from scratch daily), along with data engineering, architecture, pipeline management, and ETL using Google Cloud Platform (GCP) tools.
  • Experience with SAP.

Nice To Haves

  • Experience with Coupa, Adyen, or Zuora.
  • Experience in HTML, Python, or other coding languages.
  • Experience using data to identify opportunities for business improvement and defining/measuring the success of those initiatives.
  • Ability to use written and verbal communication to build relationships with cross-functional partners and conduct presentations for stakeholders, including executive leadership.
  • Ability to engage with PMO teams to prioritize work, provide level-of-effort (LOE) estimates, and manage bug queues and bandwidth.

Responsibilities

  • Create foundational pipelines and data extractions with large data sets using SQL and other technologies, demonstrating comfort with software-engineering-level best practices for data pipelines and data management.
  • Work with large, complex data sets to solve difficult analysis problems, assess the impact of initiatives, and create ongoing tooling/resources to run the business.
  • Collaborate with cross-functional partners to understand business context for solutions, analysis, and tooling needed.
  • Communicate findings, recommendations, and metrics to your management, stakeholders, and non-technical audiences.
  • Be the single point of contact for all data, analysis, or solutioning questions.
  • Undertake specific tasks, bugs, and maintenance work.
  • Own complex projects with minimal guidance, demonstrating mastery through the accuracy, timeliness, and volume of your work.