Lead Big Data Engineer - PySpark

Logic20/20, Seattle, WA

About The Position

As a Lead Developer joining our Advanced Analytics practice, you'll be responsible for delivering client value and ensuring high client satisfaction. You'll be expected to be adept at recognizing, subscribing to, and applying best practices, methodologies, tools, and techniques to meet client requirements, timelines, and budgets. Role responsibilities are listed under Responsibilities below.

About the Team

The Logic20/20 Advanced Analytics team is where skilled professionals in data engineering, data science, and visual analytics join forces to build simple solutions for complex data problems. We make it look like magic, but for us, it's all in a day's work. As part of our team, you'll collaborate on projects that help clients spin their data into a high-performance asset, all while enjoying the company of kindred spirits who are as committed to your success as you are. And when you're ready to level up in your career, you'll have access to the training, the project opportunities, and the mentorship to get you where you want to go.

“We build an environment where we really operate as one team, building up each other's careers and capabilities.” - Adam Cornille, Senior Director, Advanced Analytics

Requirements

  • 5-10+ years of data engineering experience using Python and Spark (PySpark).
  • 2+ years of experience leading the design and development of cloud ETL and data pipelines.
  • Experience working with fast (real-time) streaming data.
  • Experience with Flink, Kafka, and Kubernetes.
  • Experience developing CI/CD pipelines.
  • Experience with Big Data Technologies (Hadoop, Spark/PySpark, MongoDB).
  • Experience working with Azure, AWS, and/or GCP.
  • Demonstrated ability to identify business and technical impacts of user requirements and incorporate them into the project schedule.
  • Ability to travel at least once per quarter.

Nice To Haves

  • Extensive experience working with Palantir Foundry.
  • Java experience is highly preferred.
  • Consulting experience and/or experience working within the Utilities industry is highly preferred.

Responsibilities

  • Perform data assessment, analysis, and data model design.
  • Develop CI/CD pipelines and other DataOps fundamentals.
  • Communicate at the appropriate level to both business and technology teams to understand business needs and pain points.
  • Be creative in meeting the client's core needs with their technology.
  • Explain technical benefits and deficits to non-technical audiences.

What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 251-500 employees
