Staff Engineer - Data

GEICO · Austin, TX
$110,000 - $230,000

About The Position

At GEICO, we offer a rewarding career where your ambitions are met with endless possibilities. Every day we honor our iconic brand by offering quality coverage to millions of customers and being there when they need us most. We thrive through relentless innovation to exceed our customers’ expectations while making a real impact for our company through our shared purpose. When you join our company, we want you to feel valued, supported, and proud to work here. That’s why we offer The GEICO Pledge: Great Company, Great Culture, Great Rewards and Great Careers.

Position Summary

GEICO is seeking an experienced Staff Engineer with a passion for building high-performance, low-maintenance, zero-downtime data solutions. You will help drive our insurance business transformation as we transition from a traditional IT model to a tech organization with engineering excellence as its mission. Within the Data Analytics and Vertical Engineering team, you will develop state-of-the-art data pipelines, models, and reports, transforming vast datasets that reach multiple terabytes in size, while championing innovation, best practices, and continuous learning.

Position Description

As a Staff Engineer, you will work to provide an excellent user experience for our internal stakeholders across the organization and maintain the highest standards of data and analytics engineering. Our team thrives and succeeds in delivering high-quality data solutions in a hyper-growth environment where priorities shift quickly. We're seeking a visionary engineer who combines broad and deep technical expertise with strong leadership skills. The ideal candidate excels in designing advanced data processing pipelines, dimensional data modeling, and report development.

Requirements

  • Advanced programming and big data experience with Python, SQL, dbt, Spark, Kafka, Trino, Git, and containerization (Docker and Kubernetes)
  • Advanced experience with Data Warehouses (Snowflake preferred), dimensional modeling, and analytics
  • Experience with Apache Iceberg for managing large-scale tabular data in data lakes
  • Demonstrable knowledge of business intelligence tools (Power BI and Apache Superset preferred)
  • Experience with orchestration tools such as Apache Airflow or similar technologies to automate and manage complex data pipelines (a minimal sketch follows this list)
  • Experience architecting and designing new ETL and BI systems
  • Experience with supporting existing ETL and BI systems
  • Experience with CI/CD to ensure smooth and continuous integration and deployment of data solutions
  • Ability to balance the competing needs of multiple priorities and excel in a dynamic environment
  • Advanced understanding of DevOps concepts including Azure DevOps framework and tools
  • Knowledge of developer tooling across the data development life cycle (task management, source code, building, deployment, operations, real-time communication)
  • Understanding of microservice-oriented architecture, REST APIs, and GraphQL
  • Advanced understanding of data quality monitoring and automated testing
  • Strong problem-solving ability
  • 8+ years of professional experience in data and/or analytics engineering, programming, and development with big data technologies
  • 5+ years of experience with data architecture and design
  • 5+ years of experience with AWS, GCP, Azure, or another cloud service
  • 4+ years of experience with ETL and/or BI tools
  • 4+ years of experience with open-source frameworks
  • 3+ years of experience with big data tools such as Spark and Databricks
  • Bachelor’s degree in Computer Science, Information Systems, Data Science, Statistics, Data Analytics or equivalent education or work experience
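
As an illustration of the orchestration experience described above, the following is a minimal Apache Airflow sketch of a daily extract-transform-load pipeline. The DAG name, task logic, and target systems are hypothetical placeholders, not GEICO systems.

    from datetime import datetime

    from airflow.decorators import dag, task

    @dag(
        dag_id="daily_policy_metrics",  # hypothetical pipeline name
        schedule="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
    )
    def daily_policy_metrics():
        @task
        def extract() -> list[dict]:
            # In practice this would read from Kafka, an API, or a source table.
            return [{"policy_id": 1, "premium": 1200.0}]

        @task
        def transform(rows: list[dict]) -> list[dict]:
            # Example transformation: derive a monthly premium column.
            return [{**r, "premium_monthly": r["premium"] / 12} for r in rows]

        @task
        def load(rows: list[dict]) -> None:
            # In practice this would write to Snowflake or an Iceberg table.
            print(f"loaded {len(rows)} rows")

        load(transform(extract()))

    daily_policy_metrics()

Airflow infers the task dependencies from the call chain, so extract, transform, and load run in order on each daily schedule.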

Nice To Haves

  • Experience with front-end development using React/JavaScript
  • Experience with Contact Center, Marketing, Product, Sales, Service, Customer, Associate, Billing, Agency, Claims, or Telematics data

Responsibilities

  • Lead technical execution at the team level, overseeing data modeling architecture and design to ensure schemas are aligned with Business, AI, and Product analytical requirements.
  • Scope, design, and build scalable, resilient distributed systems.
  • Develop data pipelines that ingest and transform data (a minimal sketch follows this list).
  • Leverage your passion for data exploration to produce high quality reports with tools such as Power BI, Apache Superset, and React, empowering outstanding business decisions.
  • Apply your technical expertise to shape product definitions and drive towards optimal solutions.
  • Lead design sessions and code reviews with peers to elevate the quality of engineering across the organization.
  • Spearhead new feature use and innovate within existing tooling.
  • Engage in cross-functional collaboration throughout the entire development lifecycle.
  • Manage data pipelines, ensuring consistent data availability.
  • Mentor other engineers.
  • Consistently share best practices and improve processes within and across teams.
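
To ground the pipeline and data modeling responsibilities above, here is a minimal PySpark sketch of the kind of transformation involved. All paths, table names, and columns are hypothetical placeholders, not GEICO schemas.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("claims_daily_fact").getOrCreate()

    # Hypothetical raw claim events landed in a data lake.
    raw = spark.read.parquet("s3://example-bucket/raw/claims/")

    # Aggregate raw events into a simple daily fact table.
    fact_claims_daily = (
        raw.withColumn("claim_date", F.to_date("claim_ts"))
           .groupBy("claim_date", "product_line")
           .agg(
               F.count("*").alias("claim_count"),
               F.sum("paid_amount").alias("total_paid"),
           )
    )

    # In practice this would be written to Snowflake or an Iceberg table;
    # partitioned Parquet stands in here for illustration.
    fact_claims_daily.write.mode("overwrite").partitionBy("claim_date").parquet(
        "s3://example-bucket/marts/fact_claims_daily/"
    )

A fact table like this would typically be joined to dimension tables (for example, product or date dimensions) in the dimensional models that feed Power BI or Apache Superset reports.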

Benefits

  • We offer compensation and benefits built to enhance your physical well-being, mental and emotional health, and financial future.
  • Comprehensive Total Rewards program that offers personalized coverage tailor-made for you and your family’s overall well-being.
  • Financial benefits including market-competitive compensation; a 401(k) savings plan, vested from day one, that offers a 6% match; performance- and recognition-based incentives; and tuition assistance.
  • Access to additional benefits like mental healthcare as well as fertility and adoption assistance.
  • Workplace flexibility, including our GEICO Flex program, which offers the ability to work from anywhere in the US for up to four weeks per year.