IDC Guest Post
The life sciences and healthcare (LSH) industries have seen immense potential in what AI has to offer and are investing significantly in it. Deep Pharma Intelligence has reported that cumulative investments in AI-related drug development between 2014 and 2023 topped $60 billion. AI, and GenAI in particular, has caught the attention of these industries, with a fifth of the LSH industries conducting 7-10 internal- and external-facing production launches of GenAI use cases in the past year (IDC Survey: Future Enterprise Resiliency & Spending Survey Wave 4, April 2024). Roche has partnered with Silver Brain AI to develop IUCLID Assist, a GenAI tool to automate IUCLID6 submissions by chemical and pharmaceutical companies to the European Chemicals Agency. AstraZeneca, Merck, Amgen, and Sanofi are leveraging GenAI for drug discovery.
While these industries have rapidly adopted public AI solutions to roll out their own AI offerings, the escalating wave of cyber-attacks against the LSH industries has raised significant concerns. The chairman and CEO of a healthcare technology company has described these attacks on infrastructure and on personally identifiable information (PII) as ‘an act of war’. Case in point: 52% of the life sciences industry and 28% of the healthcare industry identify the ability to provide robust data security as the most important characteristic when selecting their GenAI software provider (IDC GenAI ARC Survey, August 2023).
In these highly regulated industries, compliance is critical. It is essential to reduce the risk of privacy breaches and to comply with data protection regulations such as the EU’s General Data Protection Regulation (GDPR), the US Health Insurance Portability and Accountability Act (HIPAA), and the California Consumer Privacy Act (CCPA). Not surprisingly, auditing for compliance and PII detection were the two most important criteria for the LSH industries when evaluating an AI platform (IDC Survey: Future Enterprise Resiliency & Spending Survey Wave 4, April 2024). According to IDC’s GenAI ARC Survey (August 2023), the two industries most impacted by the GenAI revolution will be life sciences, closely followed by healthcare. And per IDC’s Life Sciences GenAI Survey (November 2023), in R&D almost 90% of the industry is focusing its GenAI efforts on quality, risk, and compliance, while half is focusing on drug discovery and clinical trial optimization.
These regulated industries, which handle sensitive data, need alternatives that let them scale AI in a secure and compliant manner. Private AI is the answer. IDC’s August 2023 GenAI ARC Survey (n = 599) indicates that 83% of information technology (IT) leaders believe that GenAI models leveraging their own business data will give them a significant advantage over competitors. Private AI is a multi-faceted approach that ensures an enterprise’s control over its data and models, using techniques such as data encryption, differential privacy, data anonymization and pseudonymization, secure model governance, secure multi-party computation, and private-cloud or on-premises deployment. It delivers data privacy and security, compliance with national and global regulations, better control over data and models, and better customization and collaboration.
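To make one of these techniques concrete, below is a minimal pseudonymization sketch in Python. The record fields, the `pseudonymize` helper, and the secret key are hypothetical illustrations, not a reference implementation; a production system would manage keys in a vault and follow its own data governance policies.

```python
import hashlib
import hmac

# Hypothetical secret key held by the enterprise and never shared with an AI provider.
SECRET_KEY = b"replace-with-a-key-from-your-key-vault"

def pseudonymize(patient_id: str) -> str:
    """Replace a direct identifier with a keyed, non-reversible token (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, patient_id.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical patient record; only the pseudonymized token leaves the trusted boundary.
record = {"patient_id": "PAT-000123", "age": 54, "diagnosis": "T2D"}
training_record = {**record, "patient_id": pseudonymize(record["patient_id"])}
print(training_record)
```

The idea is simply that model training and analytics see a stable token rather than the identifier itself, so the mapping back to the patient stays inside the enterprise.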
Advantages of Private AI
- Ensuring compliance with data privacy and security regulations: GDPR requires companies to comply with a patient’s “right to be forgotten”, which means companies must delete a patient’s data if the patient withdraws consent to its storage and use. When you use public AI, if a third party holds your patient’s data and the patient requests that it be removed, you may be unable to fulfill the request. This is where private AI can play a critical role. With private AI, your enterprise data remains yours – you are in charge. Data encryption provides heightened data security. Differential privacy techniques, which involve adding noise to the data to protect individual identities while still allowing useful patterns to be detected by AI models, further enhance privacy (a minimal sketch follows this list).
- Protecting your intellectual property: Since you are using your own data to train your models, you can optimize them without sharing your data with your AI provider, who would otherwise use it to optimize public models that are accessible to everyone. This becomes even more important when considering the intellectual property that the life sciences industry manages in areas such as drug discovery, design of experiments (DoE), and the manufacturing and supply chain processes for cell and gene therapies. “Your data is our currency” is not an acceptable model for the life sciences industry, where data represents highly valuable intellectual property. Using controlled data sets and models can also minimize hallucinations and risks to patients.
- Ensuring compliance with data sovereignty regulations: Countries across the globe are implementing data sovereignty regulations, requiring that protected health information (PHI) collected or stored in a given jurisdiction be subject to the laws of that jurisdiction, with significant penalties for non-compliance. Private AI allows you to control your data flows.
- Driving better collaboration: Access to a secure model gallery enables secure collaboration among data scientists and application developers across a portfolio of models. Since these models are scanned and verified before being published in the gallery, and are stored and controlled privately by enterprise IT, the gallery serves as a robust and secure model collaboration tool with strong role-based access controls, in turn accelerating innovation.
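As referenced in the first bullet, here is a minimal differential-privacy sketch using the Laplace mechanism in Python. The cohort statistic, bounds, and epsilon value are hypothetical; a real deployment would use a vetted differential-privacy library and a managed privacy budget rather than this illustrative function.

```python
import numpy as np

def dp_mean(values: np.ndarray, lower: float, upper: float, epsilon: float) -> float:
    """Differentially private mean of a bounded attribute via the Laplace mechanism."""
    clipped = np.clip(values, lower, upper)        # bound each individual's contribution
    sensitivity = (upper - lower) / len(clipped)   # max change from altering one record
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return float(clipped.mean() + noise)

# Hypothetical example: average age in a small trial cohort, with privacy budget epsilon = 1.0.
ages = np.array([34, 47, 52, 61, 58, 45, 39, 66])
print(dp_mean(ages, lower=18, upper=90, epsilon=1.0))
```

The noise is calibrated so that adding or removing any single patient changes the released statistic only within the bounds allowed by the privacy budget, which is the property that lets useful patterns survive while individual identities stay protected.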
Investing in Private AI for Highly Data-Sensitive Initiatives Will Accelerate Innovation and Improve Clinical Outcomes
Public AI will continue to be widely leveraged by the LSH industries for more transactional AI initiatives that drive operational efficiencies, accelerating the implementation of AI solutions. But there is an increasing shift toward the development of private AI models, which leverage enterprise data and deliver solutions exclusively for the enterprise. In addition, when IDC conducted a survey commissioned by VMware on private AI, it observed that one fourth of the LSH industries preferred private AI models developed from scratch by their own organizations, owing to the multiple advantages they offer, including a wider variety of models tailored to industry-specific use cases, greater model transparency, higher model accuracy rates, and increased cost efficiencies. In fact, 34% of the LSH industries perceived the cost of developing or deploying AI models in the public cloud to be as expensive as doing so on-premises, and 28% considered it to be more expensive (Source: VMware Private AI Survey, IDC, July 2024). These advantages have led enterprises to scale the adoption of private AI solutions. Moderna, for example, has developed 750 GPTs, and hundreds of use cases are creating positive value across its teams.
IDC believes that the LSH industries will invest in the enterprise-wide adoption of private AI solutions to protect their intellectual property and ensure data privacy, even more so in areas such as precision medicine in life sciences and medical imaging in healthcare. With the Inflation Reduction Act (IRA), negotiated pricing comes into play after 13 years for biologics, as against 9 years for small molecules. Hence, the focus of the life sciences industry is going to be on biologics. GenAI will transform the discovery of new biologics, and the landscape of innovative TechBios and their partnerships with big pharma is exploding. Private AI will be critical to protecting this highly sensitive intellectual property.
Ensuring your data privacy is imperative, and private AI will pave the way.