Technical Architect

TATA Consulting Services, Malvern, PA
$100,000 - $130,000

About The Position

The Databricks Architect with AWS expertise is responsible for designing, implementing, and optimizing scalable, secure, and high-performance data and analytics solutions on the Databricks Lakehouse Platform on Amazon Web Services (AWS). The role requires a deep understanding of cloud architecture, data engineering, and analytics to deliver robust and efficient data solutions.

Requirements

  • Proven experience in designing and implementing data solutions on Databricks with a strong focus on AWS.
  • Expertise in AWS cloud services related to data, analytics, and infrastructure.
  • Proficiency in Spark (PySpark, Spark SQL) and Delta Lake.
  • Experience with Databricks Unity Catalog and Medallion architecture.
  • Strong understanding of data warehousing, data lakes, and ETL/ELT processes.
  • Familiarity with CI/CD pipelines and version control systems (e.g., Git).
  • Excellent communication and interpersonal skills.

Responsibilities

  • Design and develop end-to-end data architectures on Databricks within the AWS ecosystem, including data ingestion, storage, processing, and consumption layers.
  • Define architectural standards, best practices, and governance policies for Databricks on AWS deployments.
  • Lead the design and implementation of the Lakehouse architecture using Delta Lake, Unity Catalog, and other Databricks features.
  • Develop migration strategies for moving on-premises data workloads to Databricks on AWS.
  • Integrate Databricks with various AWS services such as S3, EC2, Lambda, Glue, Step Functions, Kinesis, and Redshift.
  • Optimize AWS resource utilization for Databricks clusters and related services, ensuring cost-efficiency and performance.
  • Design and implement secure data solutions on AWS, leveraging services like AWS IAM, VPC, and KMS.
  • Automate infrastructure provisioning and management using Infrastructure as Code (IaC) tools like AWS CloudFormation or Terraform.
  • Design and build robust data pipelines for ingestion, transformation, and analytics using Databricks (PySpark, Spark SQL) and AWS-native services.
  • Lead data modeling, schema design, performance optimization, and data governance best practices within the Databricks Lakehouse.
  • Develop and optimize Databricks SQL queries and dashboards for business intelligence and analytics.
  • Implement and manage Databricks Unity Catalog for unified data governance, access control, and data lineage.
  • Collaborate with data engineers, data scientists, and business stakeholders to translate business requirements into technical solutions.

Benefits

  • Discretionary Annual Incentive.
  • Comprehensive Medical Coverage: Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans.
  • Family Support: Maternal & Parental Leaves.
  • Insurance Options: Auto & Home Insurance, Identity Theft Protection.
  • Convenience & Professional Growth: Commuter Benefits, Certification & Training Reimbursement.
  • Time Off: Vacation, Sick Leave & Holidays.
  • Legal & Financial Assistance: Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.

What This Job Offers

Job Type

Full-time

Career Level

Mid Level

Industry

Professional, Scientific, and Technical Services

Education Level

No Education Listed

Number of Employees

5,001-10,000 employees
