Palo Alto Networks · Posted 3 months ago
$145,000 - $235,500/Yr
Full-time • Senior
Santa Clara, CA
5,001-10,000 employees

Our Data & Analytics group works with business owners and stakeholders across Sales, Marketing, People, GCS, Infosec, Operations, and Finance to solve complex business problems that directly impact the metrics used to track Palo Alto Networks' progress. We leverage the latest technologies from the cloud and big data ecosystem to improve business outcomes, building solutions through prototyping, proof-of-concept projects, and application development.

We are looking for a Senior Staff IT Data Engineer with extensive experience in data engineering, SQL, cloud engineering, and business intelligence (BI) tools. The ideal candidate will be responsible for designing, implementing, and maintaining scalable data transformations and analytical solutions that support our business objectives. This role requires a strong understanding of data engineering principles and the ability to collaborate with cross-functional teams to deliver high-quality data solutions. This is an in-office role at our HQ in Santa Clara, CA.

  • Design, develop, and maintain data pipelines to extract, transform, and load (ETL) data from various sources into our data warehouse or data lake environment.
  • Proactively identify and implement GenAI-driven solutions to achieve measurable improvements in the reliability and performance of data pipelines or to optimize key processes like data quality validation and root cause analysis for data issues.
  • Collaborate with stakeholders to gather requirements and translate business needs into technical solutions.
  • Optimize and tune existing data pipelines for performance, reliability, and scalability.
  • Implement data quality and governance processes to ensure data accuracy, consistency, and compliance with regulatory standards.
  • Work closely with the BI team to design and develop dashboards, reports, and analytical tools that provide actionable insights to stakeholders.
  • Mentor junior members of the team and provide guidance on best practices for data engineering and BI development.

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience in data engineering, with a focus on building and maintaining data pipelines and analytical solutions.
  • Expertise in SQL programming and database management systems.
  • Hands-on experience with ETL tools and technologies (e.g. Apache Spark, Apache Airflow).
  • Familiarity with cloud platforms such as Google Cloud Platform (GCP) and experience with relevant services (e.g. Dataflow, Dataproc, BigQuery and stored procedures, Cloud Composer).
  • Experience with big data tools such as Spark, Kafka, etc.
  • Experience with object-oriented and functional scripting languages such as Python and Scala.
  • Experience working with SFDC data objects (Opportunity, Quote, Accounts, Subscriptions, Entitlements) is highly desired.
  • Strong analytical and problem-solving skills, with the ability to analyze complex data sets and derive actionable insights.
  • Excellent communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
  • Aptitude for proactively identifying and implementing GenAI-driven solutions to achieve measurable improvements in the reliability and performance of data pipelines.
  • Demonstrated readiness to leverage GenAI tools to enhance efficiency within the typical stages of the data engineering lifecycle, for example by generating complex SQL queries, creating initial Python/Spark script structures, or auto-generating pipeline documentation.
  • Experience with BI tools and visualization platforms (e.g. Tableau) is a plus.

  • FLEXBenefits wellbeing spending account with over 1,000 eligible items selected by employees.
  • Mental and financial health resources.
  • Personalized learning opportunities.
  • Restricted stock units and a bonus.