Data Architect - Data, Analytics & AI (Quality)

Eli Lilly and Company, Indianapolis, IN

About The Position

At Lilly, we unite caring with discovery to make life better for people around the world. We are a global healthcare leader headquartered in Indianapolis, Indiana. Our 39,000 employees around the world work to discover and bring life-changing medicines to those who need them, improve the understanding and management of disease, and give back to our communities through philanthropy and volunteerism. We give our best effort to our work, and we put people first. We’re looking for people who are determined to make life better for people around the world.

Come leverage the power of data to enable the highest standards of pharmaceutical quality! The Global Data Analytics & AI Organization (DIA) is actively seeking a Data Architect for Data, Analytics & AI (Quality) to lead the digital transformation of our global Quality function.
This role is critical to enabling data-driven quality decisions, AI-enabled compliance monitoring, and shaping our data product strategy for quality assurance, control, and inspections. You will lead a team of analytics professionals, data scientists, and engineers to build trusted and intelligent quality data products and solutions. These will drive real-time insights, inspection readiness, and operational excellence while meeting stringent GxP, data integrity, and compliance standards. This is a strategic role, pivotal to delivering high-value analytics and AI capabilities that support quality systems, inspection outcomes, compliance assurance, and patient safety.

What You’ll Be Doing

You will be responsible for defining, designing, building, and maintaining the data ingestion and integration vision, strategy, and principles for Data, Analytics & AI (Quality). You will be primarily responsible for developing automated data pipelines. You will be a key contributor to the Integration and Data Workstream, with a focus on data publication strategy.

Requirements

  • Demonstrated communication, leadership, teamwork, project delivery, and problem-solving skills.
  • Experienced in architectural processes (e.g. blueprinting, reference architecture, governance, etc.).
  • Understanding of external data standards (e.g. HL7, CDISC, SDTM) and external data vocabularies (e.g. MedDRA, RxNorm, SNOMED).
  • Skills in data modeling, data warehousing, data integration, data governance and an understanding of data security, data standards, and cloud architecture principles.
  • Demonstrated ability to influence IT and business strategies to drive large-scale outcomes.
  • Strong learning agility and relationship-building skills, with a demonstrated ability to influence change.
  • Successful record of high-quality, user-focused, on-time and on-budget IT service and project delivery.
  • Experience with formal project management methodologies and agile frameworks (e.g. Scrum, Kanban, SAFe), and working knowledge of associated practices and tools.
  • Excellent analytical, problem-solving, and communication skills, with experience working across agile and diverse teams.
  • A high level of intellectual curiosity, external perspective, and innovation interest.
  • Experience designing large scale data models for functional, operational, and analytical environments (Conceptual, Logical, Physical & Dimensional).
  • Experience with data modeling tools such as Erwin Data Modeler, ER/Studio, and Lucidchart.
  • Experience in several of the following disciplines: statistical methods, data modeling, data administration, ontology development, semantic graph construction and linked data, relational schema design.
  • Demonstrated SQL/PLSQL and data modeling proficiency.
  • Experience with the AWS and Azure tech stacks and other cloud technologies.
  • Experience with security models and development on large data sets.
  • Experience working with a variety of relational and non-relational databases.
  • Experience with multiple database solutions (e.g. Postgres, Redshift, Aurora, Athena) and formal database designs (3NF, Dimensional Models).
  • Experience with agile development, CI/CD, Jenkins, GitHub, and automation platforms.
  • Demonstrated ability to analyze large, complex data domains and craft practical solutions for subsequent data exploitation via analytics.
  • Demonstrated ability to communicate with a geographically dispersed group of business and technical colleagues.
  • Ability to review and provide practical recommendations on design patterns, performance considerations & optimization, database versions, and database deployment strategies.
  • Knowledgeable in data functions such as Data Governance, Master Data Management, Business Intelligence.
  • Experience migrating on-premises solutions to the cloud, and knowledge of multiple data ingestion patterns.
  • Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or related field; advanced degree preferred.
  • 10–12+ years of experience in data, analytics, or AI/ML roles.
  • Experience working in regulated environments and with internal systems quality policies and procedures.

Nice To Haves

  • Knowledge of GxP, pharmaceutical manufacturing processes, and automation systems.

Responsibilities

  • Partner with Lilly architects, software vendors, and third-party implementation partners to develop and execute a technical strategy for Quality data structures and data products.
  • Work with the business to identify future uses for Lilly data and anticipated business results, and enable processes to support these needs.
  • Develop and provide expertise in enterprise data domains; this includes data relationships, data quality, understanding of business data needs and the associated technology toolsets and methodologies.
  • Leadership and Strategy: Lead and mentor a team of data engineers, fostering a culture of collaboration, innovation, and excellence. Define and drive the data engineering strategy, aligning with business objectives and technical requirements. Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and translate them into actionable engineering solutions.
  • Data Pipeline Development: Design, develop, and maintain scalable and efficient data pipelines to support data analytics, reporting, and machine learning initiatives. Implement best practices for data ingestion, integration, transformation, and storage, ensuring data quality, reliability, and accessibility. Automate data pipeline processes to improve efficiency and reduce manual intervention, leveraging tools and frameworks such as Apache Airflow, Apache Kafka, and AWS Glue.
  • Data Ingestion and Integration: Lead the development of data ingestion and integration processes, sourcing data from various internal and external sources. Collaborate with stakeholders to define data ingestion requirements and implement solutions for real-time and batch data integration. Ensure seamless data flow between systems and applications, optimizing data transfer and transformation processes for performance and scalability.
  • Technical Expertise: Stay abreast of emerging technologies and trends in data engineering, continuously evaluating and adopting new tools and techniques to enhance our data infrastructure. Provide technical leadership and guidance on data engineering best practices, coding standards, and performance optimization techniques. Hands-on involvement in data engineering tasks, including coding, debugging, and troubleshooting complex data pipeline issues.
  • Quality Assurance and Governance: Establish and enforce data engineering standards, policies, and procedures to ensure data quality, consistency, and compliance. Implement monitoring and alerting mechanisms to proactively identify and address data pipeline issues, ensuring minimal disruption to business operations. Collaborate with data governance and security teams to enforce data privacy and security measures across the data lifecycle.
  • Persistent Pod:
      ◦ Lead support and enhancement of solutions from a persistent pod standpoint.
      ◦ Create operations metrics, define parameters for data effectiveness, measure data drift, and drive operational stability of models.
      ◦ Work with the persistent pod, including vendors, to resolve outages, issues, etc.
      ◦ Work closely with manufacturing and quality leadership teams to update and align on operational needs and quality standards.
  • Cross-Functional Collaboration:
      ◦ Collaborate effectively with cross-functional teams, including data scientists, engineers, analysts, and business stakeholders, to deliver integrated solutions that meet business requirements.
      ◦ Serve as a trusted advisor to senior leadership, providing insights and recommendations on technology trends, industry best practices, and strategic opportunities in data and AI.
      ◦ Mentor junior resources, providing guidance and support in technical skill development and project execution.
  • Vendor and Partner Engagement:
      ◦ Collaborate with vendors and partners to leverage external expertise and technologies, ensuring alignment with organizational goals and standards.
      ◦ Evaluate and select appropriate vendors and partners to support data and AI initiatives, fostering strong relationships and driving successful outcomes.
  • Develop an understanding of how the enterprise data strategies, platforms, and technologies enable business execution. Use this knowledge to appropriately challenge the status quo and drive collaboration between the business and Tech@Lilly.
  • Continuously scan the technology environment for new trends and techniques; interface with external consultants and thought leaders as needed.
  • Synthesize new scalable approaches for Lilly data integration; derive the supporting arguments, communicate, and articulate the reasons for the new approaches to fellow engineers as well as senior management.
  • Possess a deep working knowledge of how data is or will be used and implications on people, processes, capabilities, and technologies including analytics and data integrations.
  • Work closely with data producers, information management, data engineers, AI/ML engineers and data consumers.