Tag Archive | 21 CFR Part 11

Can “Clinical Data Integration on the Cloud” be a reality?

The story I am about to tell is almost 8 years old. I was managing software services delivery for a global pharmaceutical company from India. This was a very strategic account, and the breadth of services covered diverse systems and geographies. It is very common for staff from the customer organization to visit our delivery centers (offsite locations) to perform process audits, conduct governance reviews, and meet people in their extended organizations.

During one such visit, a senior executive noticed that two of my colleagues, sitting next to each other, supported their system (two different implementations of the same software) across two different geographies. They happened to have the names of the systems they supported pinned to boards at their desks. The executive wanted us to take a picture of the two cubicles and email it to him. We were quite surprised at the request. Before moving on to speak to other people, he asked a couple of questions and realized the two were sharing each other's experiences and leveraging the lessons learned from one deployment for the other geography. It turned out that this does not happen in their organization; in fact, their internal teams hardly communicate, as they are part of different business units and geographies.

The story demonstrates how such organizations can become siloed due to distributed, outsourced and localized teams. Information integration has become the way of life for connecting the numerous silos created in the process. Clinical research is a complex world. While the number of players is limited, information silos multiply with the size of the organization and the distributed nature of the teams (including third parties), and with them the complexity of data integration increases. The result is very long cycle times from data “Capture” to “Submission”.

Clinical Data Integration Challenges

The challenges in integrating the clinical data sources are many. I will try to highlight some of the key ones here:

  • Study Data is Unique: Depending on the complexity of the protocol and the design of the study, the data collected varies. This makes it difficult to create a standardized integration of data coming in from multiple sources.
  • Semantic Context: While the data collected may be similar, it is very hard to integrate it meaningfully unless the context is understood. The semantics therefore become a major part of the integration process, adding to its complexity.
  • Regulations and Compliance: Given the risks associated with clinical research, every phase of the data life cycle is expected to be auditable. This makes some integrations very difficult to manage, as they may involve complex transformations along the way.
  • Disparate Systems: The IT systems used by sponsors, CROs and other parties can differ. This calls for an extensive integration exercise, leading to large projects and, in turn, huge budgets.
  • Diverse Systems: The IT systems used at each phase of the clinical data life cycle are different. This makes sense, as each system is usually meant to fulfill a specific business need, and even the functional organizations within a business unit are organized around specific areas of expertise. More often than not, these systems are a combination of home-grown and commercial off-the-shelf products from multiple vendors. Hence, the complexity of integration increases.

What is Integration on the Cloud?

As mentioned earlier, integration is a complex process. As cloud adoption increases, data may be distributed across Public, Private (including on-premise applications) and Hybrid clouds. The primary objective of integration on the cloud is to provide software-as-a-service that integrates diverse systems. It follows the same pattern as other cloud services and delivers a similar set of benefits.

The “Integration on Cloud” vendors typically offer three types of services:

  1. Out-of-the-Box Integrations: The vendor has pre-built point-to-point integrations between some of the most widely used enterprise software systems in the market (ERPs, CRMs, etc.)
  2. Do-it-Yourself: Users have the freedom to design, build and operate their own integration processes and orchestrations. The vendor may offer professional services to support users along the way.
  3. Managed Services: The vendor provides end-to-end development and support services.

From a system design and architecture perspective, the vendors typically provide a web application to define the integration touch points and orchestrate a workflow that mimics a typical Extract-Transform-Load (ETL) process. It comes with all the plumbing required to ensure that the defined process executes successfully.
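To make that pattern concrete, here is a minimal sketch of the ETL flow such a service orchestrates. This is purely illustrative Python: the endpoints, source field names (SUBJID, VISITNUM, AVAL) and target schema are hypothetical assumptions, not any vendor's actual API.

```python
import json
import urllib.request

def extract(source_url: str) -> list[dict]:
    """Pull raw records from a source system's API (hypothetical endpoint)."""
    with urllib.request.urlopen(source_url) as response:
        return json.load(response)

def transform(records: list[dict]) -> list[dict]:
    """Map source field names onto the target system's (hypothetical) schema."""
    return [
        {"subject_id": r["SUBJID"], "visit": r["VISITNUM"], "value": r["AVAL"]}
        for r in records
    ]

def load(records: list[dict], target_url: str) -> None:
    """Push the transformed records to the target system (hypothetical endpoint)."""
    payload = json.dumps(records).encode("utf-8")
    request = urllib.request.Request(
        target_url, data=payload, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(request)

def run_pipeline(source_url: str, target_url: str) -> None:
    """One end-to-end run of the Extract-Transform-Load workflow."""
    load(transform(extract(source_url)), target_url)
```

In a real service, each of these three steps would be a configurable connector in the vendor's web application rather than hand-written code.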

Who are the players?

I thought it would be useful to look at some of the early movers in this space. The following is a list (not exhaustive and in no particular order, of course) of “Integration on Cloud” providers:

  1. Dell Boomi: AtomSphere
  2. Informatica: Informatica Cloud
  3. IBM: Cast Iron Cloud Integration
  4. Jitterbit: Enterprise Cloud Edition

These vendors have specific solution and service offerings. Most provide out-of-the-box point-to-point integration of enterprise applications like ERPs, CRMs, etc. They also offer custom integrations to accomplish data migration, data synchronization, data replication and the like. One key aspect to look for is “standards-based integration”; I will explain why that matters from a clinical data integration perspective later. While this offering is still in its infancy, some customers already use these services and others are in the process of adopting them.

Clinical Data Integration on Cloud

Many of you dealing with clinical data integration may be wondering, “Why bother with integration on the cloud when we have enough trouble finding a viable solution in a much simpler environment?” For the past 4 years, I have been either building solutions and services to meet this requirement or selling partner solutions that do. I will confess that it has been a challenge, not just for me but for the customers too. There are many reasons: the need to streamline the clinical data life cycle and data management processes, retiring existing systems, bringing in new systems, organizational change, and so on. Not to mention the cost associated with all of it.

So, why do we need integration on the cloud? I firmly believe that if a solution provides the features and benefits listed below, customers will be more than willing to give it strong consideration (“If you build it, they will come”). As with all useful ideas in the past, this too will be adopted. So, what features would make clinical data integration on the cloud palatable? The following are a few key ones:

  1. Configurable: The uniqueness of studies makes every new data set coming in from partners unique, and semantics are also key to integration. Hence, a system that makes it easy to configure the integrations, for literally every study, is required.
  2. Standards: The key to solving integration problems (across systems or organizations) is reliance on standards. Standards proposed by bodies like CDISC and HL7, and widely accepted by the industry, reduce the complexity. Hence, the messaging across the integration touch points on the cloud should rely heavily on standards (see the sketch after this list).
  3. Regulatory Compliance and GCP: As highlighted earlier, clinical research is a highly regulated environment. Hence, compliance with regulations like 21 CFR Part 11, as well as adherence to Good Clinical Practices, is mandatory.
  4. Authentication and Information Security: This would be one of the key concerns for all the parties involved. Any compromise here would mean not only the loss of billions of dollars but also an adverse impact on the patients who could potentially benefit from the product being developed. PII data could even be compromised, which would be unacceptable.
  5. Cost: Given the economically lean period for the pharma industry, due to patent expiries and the macro-economic situation, this would be a key factor in the decision-making process. While a cloud service inherently converts CapEx to OpEx and thus makes costs more predictable, there will be pressure to keep the costs low for add-on services like “new study data” integration.
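To illustrate the standards point above: a receiving system that understands CDISC ODM (the XML standard for exchanging clinical study data) can flatten any compliant file into records without study-specific parsing code. A minimal, illustrative Python sketch, assuming an ODM 1.3 document:

```python
import xml.etree.ElementTree as ET

# CDISC ODM 1.3 namespace; the element and attribute names below
# (SubjectData/SubjectKey, ItemData/ItemOID/Value) follow the published standard.
ODM_NS = {"odm": "http://www.cdisc.org/ns/odm/v1.3"}

def flatten_odm(odm_xml: str) -> list[dict]:
    """Flatten ODM ClinicalData into simple (subject, item, value) records."""
    root = ET.fromstring(odm_xml)
    records = []
    for subject in root.iterfind(".//odm:SubjectData", ODM_NS):
        subject_key = subject.get("SubjectKey")
        for item in subject.iterfind(".//odm:ItemData", ODM_NS):
            records.append({
                "subject": subject_key,
                "item_oid": item.get("ItemOID"),
                "value": item.get("Value"),
            })
    return records
```

The point is that the element names come from the standard, not from any one study or sponsor, which is exactly what makes a configurable, multi-party integration service feasible.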

Conclusion

All in all, I would say that it is technically and economically possible, and a step in the right direction toward overcoming some existing challenges. Will it happen tomorrow or in the next year? My answer would be NO. In 2 to 3 years, probably YES. The key to making it happen is to try it on the cloud rather than on-premise. Some of the vendors offering integration on the cloud could be made partners in solving this age-old problem.

Update on 03/27/2012:

This post has been picked up by “Applied Clinical Trials Online” Magazine and posted on their blog -> here

Risk-based monitoring and SDV – Increased adoption of ECM

It is a universal fact that the escalating costs of discovering new medicinal products are driving sponsors and CROs to scrutinize every dollar spent in the process. The pressure is mounting fast as the blockbusters of yesteryear come off patent. This means that all the stakeholders in the value chain need to revisit their approach to existing processes and come up with innovative ways to save costs.

Key Trends:

Risk-Based Site Monitoring:

The US FDA has recognized this fact and issued a draft “Guidance for Industry: Oversight of Clinical Investigations – A Risk-Based Approach to Monitoring”. Its stated objective is “to assist sponsors of clinical investigations in developing risk-based monitoring strategies and plans for investigational studies of medical products, including human drug and biological products, medical devices, and combinations thereof”, and its overarching goal is to “enhance human subject protection and the quality of clinical trial data”. It also clarifies the strict interpretation the industry adopted long ago, under which billions have been spent monitoring sites. While this is a draft guidance, it is more than likely to establish the guiding principles for central monitoring of clinical trials, as that is its intent.

Risk-Based Source Document Verification:

The primary activity of site monitoring is source document verification, and as with site monitoring itself, the industry has mostly followed a strict interpretation of it. This resulted in 100% of source documents (patient eCRFs) being verified by site monitors. However, the industry has realized that the “law of diminishing returns” applies to this process as well and has been reducing the percentage of documents verified by the monitors. Medidata Insights' “Targeted Site Monitoring Trend Snapshot”, published on the Applied Clinical Trials website, confirms this trend: according to the report, the SDV percentage dropped from 85% in 2007 to 67% in 2010.

Virtual Clinical Trials:

Another key trend slowly evolving in this context is the complete virtualization of clinical trials, and Pfizer is taking the lead in this space. While Pfizer's REMOTE program is ongoing, interim feedback from Mr. Craig Lipset, Head of Clinical Innovation at Pfizer, is provided by Applied Clinical Trials in their article titled “Pfizer's REMOTE Virtual Experience”. As highlighted in the article, they are working through the roadblocks of an early adopter.

Adoption of Enterprise Collaboration and Content Management:

Electronic communication and collaboration:

The above trends indicate that it is just a matter of time before clinical trials are fully virtualized. The key question to address is “How will the human interactions be virtualized?” The answer is to adopt electronic communication and collaboration channels, ranging from systems that capture and manage clinical data and support electronic source document verification to seamless communication and collaboration tools. The unique constraint in clinical trials, though, is ensuring that the tools used adhere to Good Clinical Practices as well as other regulatory requirements like 21 CFR Part 11, HIPAA, etc.

Enterprise Collaboration and Content Management Systems:

Systems for Electronic Data Capture (EDC), Clinical Data Management (CDM), Clinical Trial Management (CTMS), Adverse Event Management, etc. are already available in the market. The key tool that would make it easier to transition seamlessly from human interactions to virtual interactions, in my view, is an Enterprise Collaboration and Content Management tool. Tools like Microsoft's SharePoint, as highlighted in one of my previous blog posts, will help organizations make this transition quickly and cost-effectively. While it is always easier said than done, real-world experience shows that it is relatively easy to adopt these tools in a GxP environment, meet all the regulatory compliance requirements, and still achieve the flexibility required to communicate and collaborate easily.

Social Media and Mobility:

A couple more initiatives that could complement this process are social media and mobility. The channels Pfizer's REMOTE program has adopted highlight the need for a social media outreach program to increase patient recruitment. Along the same lines, social media “like” features can be enabled on the collaboration and communication platform, increasing accessibility and improving response times from patients. Similarly, if the platform is made available on mobile devices like smartphones and tablets, patient compliance and response times will improve considerably. Tools like Microsoft SharePoint make it easy to enable these social features and to deliver content to mobile devices.

Conclusion

Overall, leveraging collaboration and content management tools tremendously increases a sponsor or CRO organization's ability to virtualize its clinical trial process. The “risk-based” approach to site monitoring and source document verification will also be made easier by these tools. As noted in my previous posts, leveraging a tool like SharePoint for this purpose will improve the Return on Investment (ROI) and reduce the Total Cost of Ownership (TCO).

As always, your feedback and comments are welcome.

Update, 20-Mar-2012:

This post has been picked up by www.AppliedClinicalTrialsOnline.com and published on their website.

SharePoint in Life Sciences Value Chain

I was putting together a presentation for a session on “Enterprise Collaboration and Content Management” earlier this week. As part of the exercise, I made a list of point solutions, in each phase of the Life Sciences value chain, that can be developed using SharePoint. The sources for this list are various articles and publications, customer requests, actual applications that I have seen or delivered to customers, and, last but not least, my own brain. I thought of sharing the list here in the hope that it is useful to some of you and triggers your imagination on leveraging your SharePoint investments.

[Image: SharePoint in Life Sciences Value Chain]

My intention is neither to claim that SharePoint is a sophisticated solution for all the business needs mentioned above nor that this list is exhaustive. I would, however, argue that using SharePoint is definitely an upgrade from the age-old manual and/or paper-based processes that are still the norm in many organizations. For those who are skeptical about SharePoint complying with regulatory requirements like 21 CFR Part 11, I want you to know that it can be “Validated”. Maybe that is a topic for a future blog post.

As always, your comments and feedback are welcome.

Safety Signal Management process automation using Microsoft SharePoint

In recent years the Life Sciences industry has seen many product recalls and market withdrawals due to patient safety concerns and the resulting actions by regulators. Pharmacovigilance has become a proactive process, as opposed to the largely reactive process it was in the past.

Electronic data collection and submission of Adverse Events (AEs) in recent years has enabled organizations to accumulate huge data sets of safety information. Some pharmaceutical organizations have invested heavily in building or procuring IT systems that can analyze these datasets and produce safety observations with the potential to become signals. This method of applying statistical algorithms to identify potential signals is termed ‘Quantitative Signal Detection’ (a small illustration follows below). The other way of identifying safety observations is the manual review of all reported cases by a medical professional to identify potential signals, termed ‘Qualitative Signal Detection’; many small to medium organizations employ this method. More than organizational size, it is the volume of cases to be analyzed that is the key criterion for leaning one way or the other. Some organizations may also employ a hybrid model, i.e. quantitative analysis followed by qualitative analysis.
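As an illustration of the quantitative approach, one widely used disproportionality measure is the Proportional Reporting Ratio (PRR). The sketch below is illustrative Python; the screening thresholds (PRR ≥ 2 with at least 3 cases) follow the commonly cited Evans criteria, and the example counts are made up:

```python
def proportional_reporting_ratio(a: int, b: int, c: int, d: int) -> float:
    """PRR for a drug-event pair from a 2x2 contingency table:
    a = reports with the drug AND the event
    b = reports with the drug, without the event
    c = reports without the drug, with the event
    d = reports with neither the drug nor the event
    """
    rate_drug = a / (a + b)    # event rate among reports for this drug
    rate_other = c / (c + d)   # event rate among all other reports
    return rate_drug / rate_other

def is_potential_signal(a: int, b: int, c: int, d: int) -> bool:
    # Screening rule from the commonly cited Evans criteria (which also
    # include a chi-squared test, omitted here for brevity):
    # PRR >= 2 with at least 3 co-reported cases.
    return a >= 3 and proportional_reporting_ratio(a, b, c, d) >= 2.0

# Made-up example: 12 reports of the event among 480 reports for the drug,
# versus 140 reports of the event among 56,000 reports for all other drugs.
print(proportional_reporting_ratio(12, 468, 140, 55860))  # 10.0
print(is_potential_signal(12, 468, 140, 55860))           # True
```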

Microsoft SharePoint is a very popular collaboration and content management platform adopted by many organizations across the globe, and Life Sciences organizations are no exception. In recent years SharePoint has seen rapid growth even in areas that demand more cautious adoption of new technology because of regulatory requirements (e.g., 21 CFR Part 11 compliance). Thanks to the efforts of Microsoft as well as other IT and Life Sciences organizations, SharePoint is being adopted even in areas where validation is required.

Signal Management is the process that ensues once signals are detected through either of the two methods, i.e. Qualitative or Quantitative Signal Detection. The information related to a safety observation and the corresponding cases that led to it is reviewed by professionals and scientists, and a decision is made on whether the signal is refuted or verified (other states are also possible). The information is usually captured in forms and passes from one state to another following a specific workflow, with associated service levels for each step (a sketch of such a workflow follows below). This makes the process a natural candidate for automation. SharePoint natively supports configurable forms, document management and simple workflows. As mentioned above, many organizations have already invested in SharePoint and can therefore save costs by leveraging it for Signal Management automation.
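Here is a minimal sketch of what such a workflow might look like, in Python for illustration only: the state names, transitions and service levels below are hypothetical, and in a real deployment they would be configured as a SharePoint list and workflow rather than coded by hand.

```python
from datetime import timedelta

# Hypothetical signal states and allowed transitions.
TRANSITIONS = {
    "detected":         {"under_review"},
    "under_review":     {"verified", "refuted", "more_info_needed"},
    "more_info_needed": {"under_review"},
    "verified":         set(),  # terminal: proceed to risk assessment
    "refuted":          set(),  # terminal: document the rationale and close
}

# Illustrative service levels per step (not drawn from any regulation).
SERVICE_LEVELS = {
    "detected": timedelta(days=2),
    "under_review": timedelta(days=10),
    "more_info_needed": timedelta(days=5),
}

def advance(current_state: str, target_state: str) -> str:
    """Move a signal record to its next state, enforcing the workflow."""
    if target_state not in TRANSITIONS[current_state]:
        raise ValueError(f"Illegal transition: {current_state} -> {target_state}")
    return target_state

# Example: a detected signal goes to review and is verified.
state = advance("detected", "under_review")
state = advance(state, "verified")
```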

Update on 02/15/2012: Please refer to my article on this topic, published by “Drug Safety Directions”, here: Signal Management using SharePoint

Cheers,

Venu.

Regulatory Document Management using SharePoint – 5 reasons to procure a COTS system

A customer of ours asked me to provide input for their decision-making process: should they leverage native SharePoint for their Regulatory Document Management, or procure a Commercial-Off-The-Shelf (COTS) package built on top of SharePoint? I have given 5 reasons from my list, below:

1. Customization vs. Configuration

SharePoint, implemented as a document management system, provides the ability to store and retrieve documents, along with features like version control. However, leveraging it for regulatory document management requires a lot of customization in terms of workflow, terminology, metadata, reports, etc. In contrast, a COTS system comes with flexible, predefined workflows and the other features a Life Sciences organization requires of a regulatory document management system.

2. Rapid Implementation

A traditional SharePoint roll-out may take months to implement and configure for regulatory requirements like 21 CFR Part 11 and electronic/digital signatures, on top of the customization required to adapt it for regulatory document management. A COTS implementation is typically completed much faster, including validation of the system to comply with regulatory requirements such as 21 CFR Part 11.

3. Electronic Signatures, Audit Trail and 21 CFR Part 11 Compliance

Most COTS systems ship with the e-signatures required for 21 CFR Part 11 compliance, whereas a regular SharePoint implementation demands additional effort to configure this feature for regulatory document management. COTS systems also provide an audit trail feature, which is then leveraged in reporting and assists in the audit process (the sketch below shows what a typical audit trail record captures).
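For illustration, an audit trail record in this context needs to capture who did what to which record, and when. The field names in this Python sketch are hypothetical assumptions, not taken from SharePoint or any specific COTS product:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEntry:
    """One computer-generated, time-stamped audit trail record, capturing
    what 21 CFR Part 11 expects: who did what to which record, and when."""
    user_id: str
    action: str                     # e.g. "create", "modify", "approve"
    record_id: str
    old_value: str | None
    new_value: str | None
    signature_meaning: str | None   # e.g. "Approved by", for signed actions
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# Example: an approval action on a regulatory document.
entry = AuditEntry("jdoe", "approve", "DOC-0042",
                   old_value="In Review", new_value="Approved",
                   signature_meaning="Approved by")
```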

4. Metadata and Terminology

Meaningful metadata is the key to searching for and retrieving documents easily, and the metadata needs fields that are domain-specific rather than generic. A traditional SharePoint implementation does not define the metadata required to store regulatory documents, whereas most COTS regulated DMS products come with predefined metadata for regulatory document management and also provide provisions to configure it based on organizational requirements (a hypothetical example follows below).
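As a hypothetical example of what “domain-specific” means here, the fields below are illustrative only; actual field sets vary by organization and by COTS product:

```python
# Hypothetical, domain-specific metadata for one regulatory document.
REGULATORY_DOC_METADATA = {
    "document_type": "Clinical Study Report",
    "study_id": "ABC-123",
    "product_name": "Example Product",
    "health_authority": "FDA",
    "country": "US",
    "status": "Draft",       # e.g. Draft -> In Review -> Approved -> Effective
    "effective_date": None,  # set when the document becomes effective
}
```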

5. Reports and Audit

Most COTS systems provide reporting capabilities out of the box; these reports are not available in a standard SharePoint implementation.

While cost, vendor dependency and other factors could lead customers to use SharePoint out of the box with some configuration, it will never be the same as a purpose-built commercial system. As always, feedback and comments are welcome.
