
Oracle Argus Safety and Adverse Event Reconciliation

Adverse Events / Adverse Drug Reactions are inherent to all interventional therapies, be they drugs, devices, vaccines or biologics. Their frequency, seriousness, breadth, etc. vary from drug to drug and from person to person. We have made a lot of progress in ensuring that all adverse events are identified, processed and reported to regulators. However, there are still many challenges in ensuring consistency in how this is done across organizations, in terms of people, process and technology.

Oracle’s Argus Safety Suite is a leading drug safety system in the market. It is a very good application with rich features. However, there are certain functions the industry needs that still have to mature, and others that are still evolving. I would like to write about one such feature, i.e. Adverse Event Reconciliation. The module in the Argus Suite that provides this functionality is “Argus Reconciliation”. The datasheet lists the benefits of reconciliation and the module’s ability to make it easy to reconcile AE data between Argus and other Clinical Data Management (CDM) systems.

What is reconciliation?

Reconciliation is the process of identifying discrepancies between the adverse event data captured in the Clinical Data Management (CDM) system and the data captured in the Safety system.

Why do they have similar data in two systems?

Adverse event data is captured in CDM systems as part of the clinical trial data collection process. The same data is also entered in Safety systems in order to capture, process and report it to regulators. Sponsors should ensure that the data submitted to regulators during the course of the trial and the data submitted as part of the overall submission are consistent. Hence, reconciliation of the data is essential. Ideally this situation would not arise if the data were collected electronically and the systems were integrated so that information flows bi-directionally. However, that is rarely the case in the real world.

For customers that have Argus Safety there are essentially three options for reconciliation:

  1. Manual
  2. Automated (COTS)
  3. Automated (Custom)

Manual: This method is largely self-explanatory. One has to extract the AE records from the Safety and CDM systems and compare the data elements line item by line item. Any discrepancy identified may lead to a) a change to the data in the CDM system or b) a change to the data in the Safety system.
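To make the manual comparison concrete, here is a minimal sketch of the kind of field-by-field check a reconciliation report performs. It assumes both systems can export AE records as CSV files keyed by study, subject and event term; the file names and column names are hypothetical and would need to be mapped to your actual extracts.

```python
import csv

# Key and comparison fields (hypothetical names; map to your real extracts)
KEY_FIELDS = ("study_id", "subject_id", "event_term")
COMPARE_FIELDS = ("onset_date", "severity", "seriousness", "outcome")

def load_extract(path):
    """Load an AE extract into a dict keyed by (study, subject, event term)."""
    with open(path, newline="") as f:
        return {tuple(row[k] for k in KEY_FIELDS): row for row in csv.DictReader(f)}

def reconcile(cdm_path, safety_path):
    cdm, safety = load_extract(cdm_path), load_extract(safety_path)
    discrepancies = []

    # Events present in one system but not the other
    for key in cdm.keys() - safety.keys():
        discrepancies.append((key, "missing in Safety system"))
    for key in safety.keys() - cdm.keys():
        discrepancies.append((key, "missing in CDM system"))

    # Field-level mismatches for events present in both systems
    for key in cdm.keys() & safety.keys():
        for field in COMPARE_FIELDS:
            if cdm[key][field] != safety[key][field]:
                discrepancies.append(
                    (key, f"{field}: CDM='{cdm[key][field]}' vs Safety='{safety[key][field]}'")
                )
    return discrepancies

if __name__ == "__main__":
    for key, issue in reconcile("cdm_ae_extract.csv", "argus_ae_extract.csv"):
        print(key, "->", issue)
```

In practice the output of such a check simply feeds the same decision described above: correct the CDM record, correct the Safety record, or document why the difference is acceptable.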

Automated (COTS): This method can be used when a commercially available integration exists between the CDM system and Argus. Of the popular CDM systems in the market, InForm (Oracle), Oracle Clinical (Oracle) and Rave (Medidata), two are from Oracle. The following outlines the integration available for each CDM system:

1) In the case of Oracle Clinical, reconciliation is available through the Argus Reconciliation module. Customers have to license this module as part of the Safety Suite in order to leverage this functionality.

2) For the InForm to Argus integration, Oracle has released a Process Integration Pack (PIP) as part of its Application Integration Architecture (AIA), which in turn is part of its Fusion Middleware strategy. This essentially requires customers to install an AIA Foundation Pack and then purchase, install and configure the PIP (Oracle® Health Sciences Adverse Event Integration Pack for Oracle Health Sciences InForm and Oracle Argus Safety).

3) Medidata’s Rave Safety Gateway product can be leveraged for integration between Rave and Argus Safety. This is essentially an E2B-based integration.

Automated (Custom): In cases where the case volume is too high for the manual option and no COTS integration exists, customers may have to rely on a custom integration. This can be accomplished in multiple ways; however, an E2B-based integration is recommended.
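As a rough illustration of what a custom E2B-based reconciliation might work with, the sketch below pulls a few identifying fields out of an E2B(R2)-style ICSR XML export so they can be matched against CDM records. The element names follow the public E2B(R2) message structure, but treat the tags and the file name as assumptions; a real implementation would have to be validated against the exact export produced by your safety system.

```python
import xml.etree.ElementTree as ET

def extract_icsr_summary(e2b_xml_path):
    """Pull basic identifying fields from each safety report in an E2B(R2)-style file."""
    tree = ET.parse(e2b_xml_path)
    reports = []
    for report in tree.getroot().iter("safetyreport"):
        reports.append({
            # Case identifier assigned by the sender (assumed E2B(R2) tag names)
            "report_id": report.findtext("safetyreportid"),
            "serious": report.findtext("serious"),
            # Reported reactions (MedDRA preferred terms)
            "reactions": [r.findtext("reactionmeddrapt")
                          for r in report.iter("reaction")],
            # Suspect/concomitant products
            "drugs": [d.findtext("medicinalproduct")
                      for d in report.iter("drug")],
        })
    return reports

if __name__ == "__main__":
    for icsr in extract_icsr_summary("argus_e2b_export.xml"):
        print(icsr["report_id"], icsr["reactions"], icsr["drugs"])
```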

I hope this post gives you a basic understanding of AE reconciliation and the options available for reconciliation between Argus Safety and three popular Clinical Data Management systems. As always, your feedback is welcome and much appreciated.


Safety Analytics – What does Oracle Argus Analytics bring to the table?

Many large drug safety organizations have initiatives to leverage the historical data available to them and measure the operational inefficiencies in their processes. In my opinion, these initiatives are worth the spend. With increased regulatory scrutiny on patient safety, a drastic increase in data sources and dwindling budgets, knowing how well (or poorly) you manage the processes that handle safety cases is worth the investment. The challenge is always in deciding what to measure beyond the typical KPIs available from the standard reports that drug safety system vendors provide, and in how flexible the tool is in letting your team define additional KPIs, create dashboards/reports and combine historical data with current data for comparative analysis.
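As a trivial example of what these KPIs boil down to, the sketch below computes an expedited-report compliance rate and an average case processing cycle time from a handful of case records. The field names, sample dates and the 15-day window are assumptions for illustration only; real definitions would come from your own SOPs and the data model of your safety system.

```python
from datetime import date

# Hypothetical case records: receipt date and regulatory submission date
cases = [
    {"case_id": "C-001", "received": date(2013, 1, 2), "submitted": date(2013, 1, 14)},
    {"case_id": "C-002", "received": date(2013, 1, 5), "submitted": date(2013, 1, 25)},
    {"case_id": "C-003", "received": date(2013, 1, 9), "submitted": date(2013, 1, 20)},
]

COMPLIANCE_WINDOW_DAYS = 15  # assumed expedited reporting timeline

cycle_times = [(c["submitted"] - c["received"]).days for c in cases]
on_time = sum(1 for days in cycle_times if days <= COMPLIANCE_WINDOW_DAYS)

print(f"Average cycle time: {sum(cycle_times) / len(cycle_times):.1f} days")
print(f"Compliance rate: {100.0 * on_time / len(cases):.0f}%")
```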

Oracle has quickly established itself as a leader in the drug safety space through the acquisitions of Relsys and Phase Forward. Across its three safety systems, it probably has three times the customers of its nearest competitor. While this creates confusion in the short term, Argus Safety is emerging as the strategic product that Oracle will support and continue to enhance in the long run. One initiative that has come to fruition around this tool is Argus Analytics (formerly Oracle Pharmacovigilance Analytics, or OPVA), which has been generally available for close to two years now. While it demands additional investment from sponsors, I think it is a good starting point for any sponsor or CRO trying to measure operational efficiency, identify bottlenecks and improve decision making.

I will keep this post brief: rather than go into the details of each KPI/dashboard available in Oracle Argus Analytics, I will simply list the dashboards. Many of these are self-explanatory. More information can be found in the user guide available online here: http://docs.oracle.com/cd/E35225_01/doc.11/e29106/toc.htm

Each dashboard is listed below with its type and filter (where applicable) and its pages (tabs):

Case Processing History (Type: Trailing; Filter: Enterprise ID)
  - Case Processing Volume History
  - Case Processing Compliance History
  - Workflow State Repetition History

Case Processing Management (Type: Current; Filter: Enterprise ID)
  - Case Processing Volume Management
  - Case Processing Compliance Management
  - Workflow State Compliance Management

Personal User Dashboard (Type: Trailing & Current; Filter: Enterprise ID)
  - Personal User Case History
  - Personal User Case Management
  - Personal User Case Work History
  - Personal User Expedited Report History
  - Personal User Expedited Report Management

Expedited Report History
  - Expedited Submission Volume History
  - Expedited Non-Submission Volume History
  - Expedited Submission Compliance History

Expedited Report Management
  - Expedited Submission Volume Management
  - Expedited Submission Compliance Management
  - Expedited Failed/Pending ACK Volume Management

Case Work History
  - Case Modified History
  - Case Unmodified History
  - Case Read History
  - Case Idle History

Hope this gives a high-level idea of the various dashboards available in Argus Analytics.

Predictive Analytics: Capacity Planning for Fluctuating Safety Case Volumes

Wikipedia defines predictive analytics as encompassing “a variety of statistical techniques from modeling, machine learning, data mining and game theory that analyze current and historical facts to make predictions about future events.” While not all of these techniques are required, a lot of data mining and statistical analysis needs to be performed on historical data before one can predict future trends and outcomes. The variables and assumptions considered determine the accuracy of the prediction, and hence how reliably risks and opportunities can be identified.

Case in point: the volume of cases that come in to a product safety case processing organization varies depending on many factors, such as the seasonality of adverse events, a news item discussing potential side effects of a product, or a blog post by a physician or some influential group or organization. With the ever-increasing cost pressures on life sciences organizations, it is very difficult to staff for peak volumes when that capacity will sit unused during the troughs. This is the kind of situation where any organization can use some predictability, so that it can plan capacity within a reasonable deviation and thus normalize the peaks and troughs.

Imagine a business intelligence solution that can mine historical case volumes and the corresponding capacity, in conjunction with process efficiency, and predict future capacity requirements. Add the ability to evaluate “what if” scenarios where one can change variables such as case volumes and see the predicted capacity requirements. While this may sound like a “Holy Grail”, it is possible with some of the sophisticated tools available. A screenshot of one such solution is shown below, followed by a simplified sketch of the underlying idea:

[Screenshot: Predictive Analytics - Product Safety Capacity Planning, a possible solution for capacity planning for a product safety organization]
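Here is a minimal sketch of the arithmetic behind such a what-if capacity model: fit a simple trend to historical monthly case volumes, project it forward, and translate the projected volume into required headcount using an assumed productivity figure. All numbers and the linear-trend assumption are purely illustrative; a real solution would use richer models and your own historical data.

```python
import numpy as np

# Hypothetical historical monthly case volumes
monthly_volumes = [820, 860, 910, 1005, 980, 1040, 1110, 1090, 1180, 1230, 1200, 1290]
CASES_PER_FTE_PER_MONTH = 60  # assumed average processor productivity

# Fit a simple linear trend to the historical volumes
months = np.arange(len(monthly_volumes))
slope, intercept = np.polyfit(months, monthly_volumes, 1)

# Project the next six months and convert volume into required FTEs
for m in range(len(monthly_volumes), len(monthly_volumes) + 6):
    projected = slope * m + intercept
    ftes = int(np.ceil(projected / CASES_PER_FTE_PER_MONTH))
    print(f"Month {m + 1}: ~{projected:.0f} cases -> {ftes} FTEs")

# "What if" scenario: volumes come in 20% above the trend next month
spike = 1.2
next_month = slope * len(monthly_volumes) + intercept
spike_ftes = int(np.ceil(spike * next_month / CASES_PER_FTE_PER_MONTH))
print(f"With a 20% spike, month {len(monthly_volumes) + 1} needs {spike_ftes} FTEs")
```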

Do you have a need for such a solution in your organization? Have you built a predictive analytics solution for other purposes? Please share your feedback and inputs.

Does Clinical Data qualify as “Big Data”?

I was at an analyst conference last week where I met a couple of analysts (no pun intended :-)) focused on Life Sciences who felt that “Big Data” is a tough sell in Life Sciences, except for genomic data. That made me think. I had always associated “Big Data” with the size of the data sets, running into petabytes and zettabytes. What I have learned in my journey since then is that the characteristics of Big Data do not start and end with size.

An article on the Mike 2.0 blog by Mr. Robert Hillard, a Deloitte principal and author, titled “It’s time for a new definition of big data”, discusses why Big Data does not simply mean “datasets that grow so large that they become awkward to work with using on-hand database management tools”, as defined by Wikipedia. He goes on to illustrate three different ways that data could be considered “Big Data”. For more, please read the blog.

One quality he describes that is of particular interest to me is “the number of independent data sources, each with the potential to interact”. Why is it of interest to me? I think clinical data, in the larger context of Research & Development, Commercialization and Post-Marketing Surveillance, definitely fits this definition. In one of my previous posts, titled “Can Clinical Data Integration on the Cloud be a reality?“, I explain the diversity of clinical data in the R&D context. Now imagine including other data sources such as longitudinal data (EMR/EHR, claims, etc.), social media and pharmacovigilance data; the complexity increases exponentially. Initiatives like the Observational Medical Outcomes Partnership (OMOP) have already proven that there is value in looking at data beyond what is collected through the controlled clinical trial process. The same applies to initiatives under way at various sponsors and other organizations to make meaningful use of data from social media and other sources. You might be interested in my other post, titled “Social Media, Literature Search, Sponsor Websites – A Safety Source Data Integration Approach”, to learn more about such approaches being actively pursued by some sponsors.

All in all, I think the complexity involved in making sense of disparate data sets from multiple sources, and analyzing them to produce meaningful insight and ensure that the benefits of medicinal products outweigh the risks, definitely qualifies clinical data as “Big Data”. Having said that, do I think organizations will be pursuing this any time soon? My answer would be no. Why? The industry is still warming up to the idea. Also, life sciences organizations are very conservative, especially when dealing with clinical data, which is considered intellectual property and carries significant compliance and regulatory requirements, so it is going to be a long time before this is adopted. The article titled “How to Be Ready for Big Data” by Mr. Thor Olavsrud on the CIO.com website outlines the industry’s current readiness and the roadmap for adoption.

The next couple of years will see the evolution of tools and technology surrounding “Big Data”, which will help organizations refine their strategies and, in turn, result in an uptick in adoption.

As always, your feedback and comments are welcome.

Safety Signal Management process automation using Microsoft SharePoint

In recent years the Life Sciences industry has seen many product recalls and market withdrawals due to patient safety concerns and the resultant actions by regulators. Pharmacovigilance has become a proactive process, as opposed to the largely reactive process it was in the past.

Electronic data collection and submission of Adverse Events (AEs) in recent years has enabled organizations to collect huge data sets of safety information. Some pharmaceutical organizations have invested heavily in building or procuring IT systems that can analyze these datasets and produce safety observations that have the potential to become signals. This method of applying mathematical and statistical algorithms to identify potential signals is termed ‘quantitative signal detection’. Another way of identifying safety observations is the manual process in which a medical professional reviews all reported cases to identify potential signals. This is termed ‘qualitative signal detection’. Many small to medium organizations employ this method. More than the size of the organization, it is the volume of cases to be analyzed that is the key criterion for leaning one way or the other. Some organizations may also employ a hybrid model, i.e. quantitative analysis followed by qualitative analysis.
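As one concrete example of a quantitative technique, disproportionality measures such as the proportional reporting ratio (PRR) are commonly applied to spontaneous report databases. The sketch below computes a PRR from a 2x2 contingency table of report counts; the counts and the screening rule of thumb are illustrative only, and a real signal detection system would apply its own validated criteria.

```python
def proportional_reporting_ratio(a, b, c, d):
    """
    PRR for a drug-event pair from a 2x2 contingency table of report counts:
      a: reports with the drug of interest AND the event of interest
      b: reports with the drug of interest and any other event
      c: reports with the event of interest for all other drugs
      d: reports with other drugs and other events
    """
    rate_drug = a / (a + b)    # proportion of the drug's reports mentioning the event
    rate_others = c / (c + d)  # proportion of other drugs' reports mentioning the event
    return rate_drug / rate_others

# Illustrative counts only
a, b, c, d = 25, 975, 500, 99500
prr = proportional_reporting_ratio(a, b, c, d)
print(f"PRR = {prr:.2f}")

# A commonly cited screening rule of thumb (an assumption here, not policy):
# flag when PRR >= 2 and at least 3 co-reported cases exist
if prr >= 2 and a >= 3:
    print("Potential signal for further qualitative review")
```

Observations flagged this way are exactly the kind of output that feeds the qualitative review and the signal management workflow discussed below.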

Microsoft SharePoint is a very popular collaboration and content management platform adopted by many organizations across the globe, and Life Sciences organizations are no exception to this trend. In recent years SharePoint has seen rapid growth even in areas that call for more cautious adoption of new technology because of regulatory requirements (e.g. 21 CFR Part 11 compliance). Thanks to efforts from Microsoft as well as other IT and Life Sciences organizations, SharePoint is now being adopted even in areas where validation is required.

Signal management is the process that ensues once signals are detected using one of the two methods, i.e. qualitative or quantitative signal detection. The information related to a safety observation and the corresponding cases that led to it are reviewed by medical professionals and scientists, and a decision is made on whether the signal is refuted or verified (other states are also possible). The information is usually captured in forms and passes from one state to another following a specific workflow, with associated service levels for each step in the process. This lends itself well to automation. SharePoint natively supports the configuration of forms, management of documents and simple workflows. As mentioned above, many organizations have already invested in SharePoint and can therefore save costs by leveraging this tool for signal management automation.
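To illustrate the kind of workflow that would be configured (for example, as a SharePoint list with an attached workflow), here is a minimal sketch of the signal lifecycle as a simple state machine with a service level per step. The states, transitions and day counts are assumptions for illustration, not a prescribed process.

```python
from datetime import date, timedelta

# Assumed signal lifecycle states, allowed transitions and service levels (in days)
TRANSITIONS = {
    "Detected": ["Under Review"],
    "Under Review": ["Validated", "Refuted", "More Information Needed"],
    "More Information Needed": ["Under Review"],
    "Validated": [],  # terminal in this sketch; real processes continue (e.g. risk assessment)
    "Refuted": [],
}
SERVICE_LEVEL_DAYS = {"Detected": 5, "Under Review": 30, "More Information Needed": 15}

class SignalRecord:
    def __init__(self, signal_id, observation):
        self.signal_id = signal_id
        self.observation = observation
        self.state = "Detected"
        self.due_date = date.today() + timedelta(days=SERVICE_LEVEL_DAYS[self.state])

    def move_to(self, new_state):
        # Enforce the workflow: only configured transitions are allowed
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"Cannot move from {self.state} to {new_state}")
        self.state = new_state
        days = SERVICE_LEVEL_DAYS.get(new_state)
        self.due_date = date.today() + timedelta(days=days) if days else None

signal = SignalRecord("SIG-042", "Increase in reports of QT prolongation")
signal.move_to("Under Review")
print(signal.state, "due by", signal.due_date)
```

In SharePoint, the equivalent would be a validated list or form library holding the signal records, with the transition rules and service levels configured in the workflow rather than in code.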

Update on 02/15/2012: Please refer to my article on this topic, published by “Drug Safety Directions”, here: Signal Management using SharePoint

Cheers,

Venu.
