Data Analyst Engineer

MAP International, Brunswick, GA

About The Position

We are seeking an experienced Data Analyst/Engineer to lead our data infrastructure modernization and analytics transformation initiatives. This strategic role will architect and implement scalable cloud-based solutions that transform data into actionable insights, driving organizational impact and supporting our growth objectives. As a technical leader, you will design enterprise data systems, champion best practices, and leverage advanced analytics including machine learning and AI to solve complex challenges in our humanitarian mission to deliver medicines globally.

Requirements

  • AWS Cloud Architecture: Expert-level experience designing and implementing solutions with AWS Athena, S3, Lambda, QuickSight, and additional services (Glue, EMR, SageMaker, Step Functions, CloudWatch)
  • Programming & Scripting: Advanced proficiency in Python for data engineering, ML/AI development, and automation; expert SQL skills including query optimization, stored procedures, and performance tuning
  • API Development: Proven experience architecting enterprise API solutions, microservices, and system integrations; familiarity with API gateway patterns and service mesh architecture
  • Data Modeling: Deep expertise in dimensional modeling, data vault, normalization theory, and database design patterns for both OLTP and OLAP systems
  • Big Data Technologies: Hands-on experience with distributed computing frameworks (Spark, Hadoop ecosystem) and modern data lake architectures
  • Machine Learning & AI: Production experience deploying ML models; proficiency with scikit-learn, TensorFlow, PyTorch, or similar; understanding of MLOps principles
  • DevOps & Infrastructure: Experience with CI/CD pipelines, Infrastructure as Code (Terraform, CloudFormation), Git workflows, and containerization (Docker, Kubernetes)
  • Data Integration: Expertise with ETL/ELT tools, data orchestration platforms (Airflow, AWS Step Functions), and integration platforms (Zapier, MuleSoft)
  • 7-10+ years of progressive experience in data engineering, analytics, or related technical roles
  • Demonstrated track record of architecting and delivering enterprise-scale data solutions
  • Proven experience leading technical initiatives from conception through production deployment
  • Experience working in cloud-native environments with emphasis on AWS ecosystem
  • History of delivering measurable business impact through data-driven solutions
  • Experience partnering with executive leadership and translating business needs into technical strategies
  • Bachelor’s degree in Computer Science, Data Science, Statistics, Information Systems, Engineering, or a related technical field required
  • Strategic thinking with ability to align technical solutions to organizational objectives
  • Exceptional analytical and problem-solving skills with bias toward action
  • Outstanding communication skills; able to present complex technical concepts to both technical and non-technical audiences including board members and executive leadership
  • Proven ability to influence and drive change across organizational boundaries
  • Strong project management skills with ability to manage multiple competing priorities
  • Leadership and mentoring capabilities; able to guide and develop others
  • Deep commitment to data quality, security, and ethical use of information
  • Self-directed with entrepreneurial mindset; thrives in dynamic, mission-driven environments

Nice To Haves

  • AWS Certifications: Solutions Architect Professional, Data Analytics Specialty, Machine Learning Specialty, or DevOps Engineer Professional
  • Advanced AWS Experience: Hands-on with Glue, EMR, Redshift, SageMaker, EventBridge, SNS/SQS, CloudFormation/CDK
  • Enterprise Integration: Experience with Salesforce administration/development, WordPress/WooCommerce ecosystems, Smartsheet automation
  • Data Visualization: Proficiency with Tableau, Power BI, or similar enterprise BI platforms beyond QuickSight
  • Programming Languages: Additional languages such as R, Java, Scala; experience with functional programming concepts
  • Pharmaceutical/Healthcare: Understanding of pharmaceutical supply chain, regulatory compliance (FDA, DSCSA), or healthcare data standards (HL7, FHIR)
  • Nonprofit Sector: Experience with nonprofit operations, donor management systems, grant reporting, or impact measurement
  • Security & Compliance: Knowledge of HIPAA, SOC 2, data privacy regulations (GDPR, CCPA)
  • Agile Methodologies: Experience with Scrum, Kanban, or other agile frameworks
  • Thought Leadership: Published articles, conference presentations, or contributions to open-source projects
  • Advanced Degree: Master’s degree in Data Science, Computer Science, Applied Statistics, or an MBA with a technical focus strongly preferred
  • Team Leadership: Experience managing or leading technical teams or major initiatives
  • Relevant professional certifications (AWS, Google Cloud, Azure, or specialized data/ML certifications) may substitute for an advanced degree when combined with equivalent experience
  • PharmD (Doctor of Pharmacy) combined with technical experience is highly valued: pharmaceutical domain expertise paired with data engineering skills offers an exceptional advantage in understanding our medicine distribution operations, regulatory compliance requirements, and supply chain complexities

Responsibilities

  • Lead the design and implementation of enterprise data architecture supporting organizational strategy
  • Define and execute data strategy aligned with organizational objectives
  • Champion data governance, quality standards, and best practices across the organization
  • Evaluate and recommend emerging technologies to enhance data capabilities
  • Serve as technical advisor to senior leadership on data-related initiatives
  • Drive data literacy and analytics adoption across functional teams
  • Architect and implement enterprise-scale data solutions using AWS services including S3, Lambda, Athena, and related technologies
  • Design and optimize complex ETL/ELT pipelines handling multi-source data integration at scale
  • Establish automated data quality frameworks, monitoring systems, and alerting mechanisms
  • Lead cloud migration initiatives and infrastructure modernization projects
  • Implement Infrastructure as Code (IaC) using CloudFormation, Terraform, or similar tools
  • Design cost-optimization strategies for cloud infrastructure and data storage
  • Ensure data security, compliance, and disaster recovery capabilities
  • Design and deliver enterprise analytics solutions and executive dashboards using AWS QuickSight
  • Lead development of organizational KPIs and performance measurement frameworks
  • Conduct advanced statistical analysis to inform strategic decision-making
  • Partner with senior leadership to translate business challenges into data-driven solutions
  • Build self-service analytics capabilities enabling stakeholder independence
  • Create compelling data narratives and visualizations for board presentations and stakeholder reporting
  • Architect and implement enterprise API strategy for data access and system integration
  • Design RESTful APIs following industry best practices and security standards
  • Lead integration projects connecting internal systems (Salesforce, WordPress, WooCommerce) with external platforms
  • Establish API governance including versioning, documentation, and lifecycle management
  • Implement authentication protocols, rate limiting, and comprehensive error handling
  • Optimize API performance and scalability for high-volume transactional systems
  • Architect and deploy production-grade machine learning solutions using AWS SageMaker and related services
  • Lead end-to-end ML lifecycle from problem definition through model deployment and monitoring
  • Implement MLOps practices including automated retraining, A/B testing, and performance monitoring
  • Apply AI/ML to optimize operational efficiency, predict demand, and enhance decision-making
  • Evaluate emerging AI technologies and their applicability to organizational challenges
  • Establish ethical AI practices and model governance frameworks
  • Design and implement enterprise data models supporting analytical and operational workloads
  • Create and maintain dimensional models (star schema, snowflake) optimized for business intelligence
  • Establish data modeling standards, naming conventions, and documentation practices
  • Lead data normalization and denormalization strategies based on use case requirements
  • Design for scalability, anticipating data growth well beyond current volumes
  • Implement advanced optimization techniques including partitioning, indexing, and query tuning
  • Mentor and guide junior data professionals and cross-functional team members
  • Lead technical design reviews and code review processes
  • Collaborate with IT leadership on technology roadmap and digital transformation initiatives
  • Partner with external vendors and consultants on complex technical implementations
  • Facilitate knowledge sharing through documentation, training, and best practice dissemination
  • Architect solutions for processing and analyzing large-scale datasets (multi-terabyte)
  • Implement distributed computing frameworks and parallel processing strategies
  • Design data lake architectures balancing performance, cost, and accessibility
  • Leverage big data technologies (Hadoop, Spark) when appropriate for workload requirements
  • Optimize storage and compute resources for cost-effective big data processing