Wavicle Data Solutions · Posted about 2 months ago
Full-time • Mid Level
Remote • Chicago, IL
501-1,000 employees

Wavicle Data Solutions is hiring a Databricks Solution Architect who will be responsible for leading the design and implementation of scalable, optimized solutions that leverage the latest Databricks features. This individual will work closely with customers to understand their needs and business drivers and help them adopt and optimize Databricks for their analytics, data science, and AI/ML workloads. They will provide thought and technical leadership, ensure best practices, and align customer strategies with Databricks’ offerings. They will also be part of a team helping the company identify and build points of view for the market and shape our Databricks go-to-market strategy.

  • Solution Design: Develop data architecture solutions and reference architectures for customers using Databricks.
  • Customer Engagement: Work closely with clients to understand their business goals and technical needs, ensuring optimal use of the Databricks platform.
  • Pre-Sales Support: Provide technical expertise during the pre-sales process, conducting workshops and proof-of-concept (POC) projects.
  • Technical Leadership: Lead complex projects involving Databricks integration with cloud environments such as AWS, Azure, or GCP.
  • Performance Optimization: Help customers optimize performance for large-scale data processing, streaming, and machine learning workflows on Databricks.
  • Training and Mentorship: Guide customers and internal teams on best practices in data engineering, machine learning, and AI using Databricks.
  • Collaboration: Work cross-functionally with engineering, product, and sales teams to deliver successful implementations.
  • Market Strategy: Help the company identify market needs, define a go-to-market strategy, and execute on it.
  • Experience: 10+ years of experience in data engineering, data architecture, or related fields. Hands-on experience with Databricks is a must.
  • Cloud Experience: Proficiency in cloud platforms (AWS, Azure, GCP), with strong knowledge of cloud-native technologies and services.
  • Programming Skills: Expertise in Python, Scala, and SQL for large-scale data processing.
  • Data Engineering: Experience in designing and implementing data pipelines, ETL processes, DevOps and SecOps practices, and data lakes.
  • Machine Learning: Familiarity with ML workflows and libraries such as TensorFlow, PyTorch, and scikit-learn.
  • Big Data Tools: Strong experience with big data tools like Apache Spark, Hadoop, and Delta Lake.
  • Communication: Excellent communication and interpersonal skills to effectively engage with both technical and non-technical stakeholders.
  • Certifications: Databricks certifications or cloud certifications (AWS Certified Solutions Architect, Microsoft Azure Solutions Architect Expert, etc.).
  • Experience with Data Governance: Knowledge of data governance, security practices, and compliance in cloud environments.
  • Problem-solving mindset
  • Strong customer-facing skills
  • Ability to engage with C-level executives, especially CTOs and CDOs
  • Strategic thinking with a focus on scalable and efficient architectures
  • Ability to create content that supports Databricks marketing and sales efforts
  • Ability to work independently and in team settings
  • Health Care Plan (Medical, Dental & Vision)
  • Retirement Plan (401k, IRA)
  • Life Insurance (Basic, Voluntary & AD&D)
  • Unlimited Paid Time Off (Vacation, Sick & Public Holidays)
  • Short Term & Long Term Disability
  • Employee Assistance Program
  • Training & Development
  • Work From Home
  • Bonus Program