Full-time
Remote • Louisville, KY

The Azure Databricks Developer role is designed for an experienced professional with a strong background in big data engineering and the Azure data ecosystem. This position focuses on leading the transition from Azure Synapse Analytics to Databricks, utilizing advanced programming skills in Python and SQL to develop scalable and secure data solutions. The developer will be responsible for creating a medallion architecture for data management and ensuring data security while collaborating with cross-functional teams to meet business requirements.

  • Migrate and optimize workloads from Azure Synapse Analytics to Databricks.
  • Develop and implement big data engineering systems using Python, SQL, and Databricks.
  • Create medallion architecture (bronze, silver, gold) for structured and unstructured data.
  • Automate workloads using tools like Databricks workflows, Prefect, or CI/CD pipelines.
  • Ensure data security through schema, table, and row-level security, as well as dynamic data masking.
  • Develop monitoring, logging, and alerting systems for production-grade data pipelines.
  • Work with credential management systems such as Azure Key Vault or HashiCorp Vault.
  • Collaborate with cross-functional teams to translate business requirements into technical solutions.
  • Analyze data and create impactful visualizations using Power BI.
  • Integrate solutions with monitoring and service-management platforms such as ServiceNow, Splunk, Dynatrace, or Azure Log Analytics.
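The medallion architecture named above is typically built in Databricks with PySpark and Delta tables; as a minimal, framework-free sketch of the layering idea (plain Python stands in for Spark DataFrames, and the record fields `region` and `amount` are hypothetical):

```python
# Minimal sketch of medallion layering (bronze -> silver -> gold).
# Plain Python dicts stand in for Spark DataFrames / Delta tables;
# the field names (region, amount) are illustrative only.

def bronze_ingest(raw_rows):
    """Bronze: land raw records as-is, tagged with their layer."""
    return [dict(row, _layer="bronze") for row in raw_rows]

def silver_clean(bronze_rows):
    """Silver: reject malformed records and normalize types."""
    cleaned = []
    for row in bronze_rows:
        if row.get("amount") is None:
            continue  # drop incomplete records
        cleaned.append({"region": row["region"].strip().upper(),
                        "amount": float(row["amount"])})
    return cleaned

def gold_aggregate(silver_rows):
    """Gold: business-level aggregate (total amount per region)."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

raw = [{"region": " east ", "amount": "10.5"},
       {"region": "WEST", "amount": None},   # rejected at the silver layer
       {"region": "east", "amount": "4.5"}]
print(gold_aggregate(silver_clean(bronze_ingest(raw))))  # → {'EAST': 15.0}
```

In a real Databricks workload each layer would be a Delta table written by a job or workflow task, with security (row-level filters, masking) applied at the silver and gold layers.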
  • 7+ years of experience in big data engineering.
  • 5+ years' programming experience in Python and SQL at an expert level.
  • 5+ years' experience in the Azure data ecosystem, including Databricks and Azure Synapse Analytics.
  • 3+ years' experience designing big data systems with medallion architecture.
  • 2+ years' experience in automating workloads and ensuring data security.
  • 2+ years' experience with source/version control systems like GitHub or Azure DevOps.
  • Proven ability to independently manage big data engineering projects.
  • Proficiency in Power BI, including Star Schema, DAX, and performance optimization.
  • Experience with platforms such as ServiceNow, Splunk, or Dynatrace.
  • Strong analytical and problem-solving skills for handling large and complex datasets.
  • Expertise in credential management and automated response systems.