Senior Architect

Reveal Global Consulting LLC · Fulton, MD
Hybrid

About the Position

At a high level, the candidate must be a hands-on Data Engineer/Architect who can lead the work: engaging directly with the client, understanding their requirements, and designing and developing the solution. Strong communication and the ability to collaborate effectively are essential.

Requirements & Responsibilities

  • Minimum 10 years of experience
  • Data Pipeline Development: Design, implement, and manage robust data pipelines using Python, PySpark, and SQL to efficiently extract, transform, and load data from diverse sources, both batch and streaming (see the PySpark sketch after this list).
  • AWS Expertise: Demonstrate expertise in core AWS services such as AWS DMS, AWS Glue, AWS Step Functions, Amazon S3, Amazon Redshift, Amazon RDS, Amazon EMR, AWS IAM, and AWS Lambda, and apply them to build scalable and reliable data solutions.
  • Data Modeling: Develop and maintain efficient data models to support analytical and reporting needs.
  • Database Management: Administer databases using AWS services like Amazon RDS or Amazon Redshift, focusing on schema design, performance optimization, and monitoring.
  • Data Warehousing: Utilize Amazon Redshift or Snowflake to create high-performing analytical databases that empower data-driven decision-making.
  • ETL Best Practices: Implement industry best practices for ETL processes, including data validation, error handling, and data quality checks.
  • Performance Optimization: Optimize query performance through continuous tuning of databases and leveraging AWS's scalability capabilities.
  • Monitoring and Logging: Establish robust monitoring and logging mechanisms using Amazon CloudWatch, AWS CloudTrail, or comparable tools to ensure pipeline reliability (see the CloudWatch sketch after this list).
  • Security and Compliance: Ensure adherence to security best practices and relevant compliance standards, tailoring solutions to meet GDPR, HIPAA, or other regulatory requirements.
  • Automation: Drive automation of deployment and scaling of data pipelines using infrastructure-as-code (IaC) tools such as AWS CloudFormation or Terraform (see the CDK sketch after this list).
  • Collaboration: Collaborate closely with cross-functional teams, including data scientists, analysts, and other stakeholders, to understand their data needs and provide effective solutions.
  • Continuous Learning: Stay updated on the latest developments in AWS services and data engineering methodologies, applying new insights to enhance our data infrastructure.
  • Soft Skills: Exhibit strong communication skills to facilitate effective teamwork and interaction with diverse groups.
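
To make the pipeline expectations above concrete, here is a minimal PySpark batch ETL sketch. The S3 paths, column names, and validation rule are hypothetical placeholders for illustration, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical locations and names, for illustration only.
RAW_PATH = "s3://example-raw-zone/orders/"
CURATED_PATH = "s3://example-curated-zone/orders/"

spark = SparkSession.builder.appName("orders-batch-etl").getOrCreate()

# Extract: read raw JSON landed by an upstream process (e.g., AWS DMS).
raw = spark.read.json(RAW_PATH)

# Transform: normalize types and derive a partition column.
orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Data-quality gate: divert rows missing keys instead of silently loading them.
bad_row = F.col("order_id").isNull() | F.col("amount").isNull()
rejected = orders.filter(bad_row)
valid = orders.filter(~bad_row)
rejected.write.mode("append").parquet(CURATED_PATH + "_rejects/")

# Load: write partitioned Parquet for downstream Redshift/Athena consumption.
valid.write.mode("overwrite").partitionBy("order_date").parquet(CURATED_PATH)
```

In practice a job like this would run on AWS Glue or Amazon EMR and be orchestrated with AWS Step Functions, per the services listed above.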

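For the monitoring and logging expectation, a small boto3 sketch that publishes custom pipeline metrics to Amazon CloudWatch might look like the following; the namespace and metric names are assumed for illustration.

```python
import logging
import boto3

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("pipeline.monitor")

# Assumes AWS credentials and region are configured in the environment.
cloudwatch = boto3.client("cloudwatch")

def report_run(records_loaded: int, records_rejected: int) -> None:
    """Publish pipeline health metrics so CloudWatch alarms can flag bad runs."""
    cloudwatch.put_metric_data(
        Namespace="ExamplePipeline",  # hypothetical namespace
        MetricData=[
            {"MetricName": "RecordsLoaded", "Value": float(records_loaded), "Unit": "Count"},
            {"MetricName": "RecordsRejected", "Value": float(records_rejected), "Unit": "Count"},
        ],
    )
    logger.info("run complete: loaded=%d rejected=%d", records_loaded, records_rejected)

report_run(records_loaded=10_000, records_rejected=42)
```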
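
The posting names AWS CloudFormation and Terraform for IaC; as a Python-flavored sketch of the same idea, the AWS CDK (which synthesizes CloudFormation templates) can declare pipeline storage. The stack and bucket names are hypothetical.

```python
from aws_cdk import App, Stack, RemovalPolicy, aws_s3 as s3
from constructs import Construct

class PipelineStorageStack(Stack):
    """Hypothetical stack provisioning the S3 zones a pipeline reads and writes."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Raw landing zone: versioned so bad loads can be rolled back.
        s3.Bucket(
            self,
            "RawZone",
            versioned=True,
            removal_policy=RemovalPolicy.RETAIN,  # keep data if the stack is deleted
        )

        # Curated zone for query-ready Parquet output.
        s3.Bucket(self, "CuratedZone", versioned=True)

app = App()
PipelineStorageStack(app, "PipelineStorageStack")
app.synth()
```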

Benefits

  • Medical/Prescription Drug Plan
  • Dental Plan
  • Vision Plan
  • Basic Life Insurance
  • Accidental Death & Dismemberment Insurance
  • Short Term Disability Insurance
  • Long-Term Disability Insurance
  • 401(k) Retirement Plan
  • Flexible Spending Account (FSA)
  • Employee Assistance Program (EAP)
  • Paid Time Off and Holidays