Healthcare and Life Sciences

The combination of increasingly powerful computers and AI offers the possibility of detecting, diagnosing, and curing diseases like never before. At IBM Research, we’re working on creating software and AI systems that can convert reams of health data into usable information for clinicians the world over.

IBM and Cleveland Clinic unveil the first quantum computer dedicated to healthcare research

  • Accelerated Discovery
  • Quantum Network
  • Quantum Systems

IBM Research and JDRF continue to advance biomarker discovery research

  • Life Sciences

Accelerating discoveries in immunotherapy and disease treatment

Why now is the time to accelerate discoveries in health care

IBM is partnering with the Oxford Pandemic Sciences Institute

Computer simulations identify new ways to boost the skin’s natural protectors

  • Materials Discovery

  • Physical Sciences
  • See more of our work on Healthcare and Life Sciences


Urgency of Science

Research leads IBM’s response to COVID-19

To meet the global challenge of COVID-19, the world must come together. IBM has resources to share — like supercomputing power, virus tracking systems, and an AI assistant to answer citizens’ questions.


Here’s How IBM Watson Health Is Transforming the Health Care Industry

Watson computer at IBM in New York City

Imagine if you had a rare, undiagnosed disease that’s stumped doctor after doctor. What if there were a single, secure database that could read your symptoms then run through thousands of clinical studies, similar patient records, and medical textbooks to present a risk-matched list of potential diseases?

Just one year after its launch, IBM Watson Health is already starting to make this seemingly impossible task a reality, thanks to its powerful cognitive computing platform and a wide-reaching partnership strategy.

Watson’s vision is to enable better care by surfacing insights from the massive amounts of personal and academic health data being generated every day, but IBM (IBM) needs partners within the medical, pharmaceutical, and hospital fields to make those insights relevant to on-the-ground practitioners. It’s institutions and companies like the Mayo Clinic, CVS Health (CVS), and Memorial Sloan Kettering Cancer Center that are adapting the innovative new technology to real-life applications.


“No one company is big enough to transform an industry on its own,” says Kathy McGroddy, vice president of IBM Watson Health. “It takes a village to change.”

One of IBM’s tentpole programs within health care is the Watson for Oncology application developed in partnership with New York’s Memorial Sloan Kettering Cancer Center (MSK).

Some MSK oncologists have a highly specific expertise in certain cancers. By training Watson to think like they do, that knowledge expands from one specialist to any doctor who is querying Watson. That means that a patient can get the same top-tier care as if they traveled directly to the center’s offices in Manhattan. IBM’s Watson provides the framework to learn, connect, and store the data, while MSK is imparting its knowledge to train the computer.

The app, which can be run on an iPad or other tablet, is able to pack in all the expertise of MSK oncologists into one place so that any doctor anywhere is able to provide elite cancer care. This is significant for patients who live in areas without world-class medical services, like lower-income countries or rural America.


“The handwriting was on the wall. This kind of a concept was not an ‘if’ question but a ‘when’ question,” said Mark Kris, a medical oncologist at MSK and the lead physician for the institution’s IBM Watson collaboration. “We knew we wanted to be part of the team that developed it.”

IBM and MSK have been closely linked for many years, making their partnership a natural evolution. Both IBM CEO Ginni Rometty and former IBM chairman Lou Gerstner sit on MSK’s board. However, it was MSK that first approached IBM about using Watson after watching the computer defeat two past grand champions on the Jeopardy! TV quiz show, David Kerr, director of corporate strategy at IBM, wrote in an article for the Huffington Post in 2012.

“I credit leaders at Memorial Sloan-Kettering for envisioning a way to have a huge impact on cancer treatment worldwide,” wrote Kerr.

Revolutionizing Access

On the ground, the partnership means that any doctor anywhere can query Watson for Oncology on the iPad app, if the hospital has licensed the program.

For example, say a patient has a rare, genetically linked form of lung cancer. A generalized cancer doctor likely hasn’t had the time to keep up with the latest in specific lung cancer treatments. In the last year alone, there have been at least seven new lung cancer drugs approved by the FDA. That doctor may not be aware of how best to use those drugs or even if they apply to this patient.

At the University of Texas MD Anderson Cancer Center, Assistant Professor of Leukemia Dr. Courtney DiNardo uses IBM's Watson cognitive system while consulting with patient Rich Ware.

Meanwhile, Watson for Oncology has been fed previous case studies on patients like this by lung cancer specialists at MSK, so it understands the case and will spit out a list of potential treatments for the doctor, with a percentage rank of certitude and risk next to each option. The doctor then reviews the list and makes the final treatment decision in consult with the patient.

It’s this level of specificity that is transformative for both doctors and patients, taking centralized expertise and fanning it out across areas as far as India and Thailand, where Watson for Oncology is already being used in select hospitals.

“If it doesn’t get to people who benefit, it’s just irrelevant,” said Kris. “This isn’t meant to be theoretical.”

IBM and MSK didn’t release details of the joint venture but there is a shared revenue agreement in place, Kris said, similar to how licensing agreements are struck between pharmaceutical companies for new drugs.

Beyond Cancer

While Watson Health collaborates with more than a dozen cancer institutes to find new ways to treat the disease using genomic data, its partnerships also extend well beyond cancer care. It is working with CVS Health to use predictive analytics to transform care management for patients with chronic diseases, an important way to extend ongoing medical care beyond the standard doctor’s office.

Other Watson Health partners include:

  • Medtronic (MDT): Predicting hypoglycemic episodes in diabetic patients nearly three hours before their onset, preventing devastating seizures.
  • Apple (AAPL): Storing and analyzing ResearchKit data.
  • Johnson & Johnson (JNJ): Analyzing scientific papers to find new connections for drug development.
  • Under Armour (UA): Powering a “Cognitive Coaching System” that provides athletes coaching around sleep, fitness, activity, and nutrition.

Each of these programs is an equal partnership between IBM and the other company. The two work hand in hand to train Watson and establish a functional platform to query the supercomputer, and each has its own unique business model.

“What’s happening in health care is new things, new ideas, and new models. There’s an opportunity to really look at different ways to create and share value together,” says McGroddy, who declined to release details of those business arrangements.

On August 6, 2015, IBM announced that Watson would gain the ability to “see” by bringing together Watson’s advanced image analytics and cognitive capabilities with data and images obtained from Merge Healthcare Incorporated’s medical imaging management platform.

The Watson Health Ecosystem

IBM also opens up its platform to allow other companies to tinker or develop their own programs. Watson makes its capabilities available as an API (application program interface), so companies of all sizes can use cognitive computing to meet their needs. Developers, entrepreneurs, data hobbyists, and students have built more than 7,000 applications through the Watson Ecosystem.
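
The specifics of each partner’s integration are not public, but the general pattern — calling a hosted Watson service over its API — can be sketched roughly as below. This is a minimal illustration assuming IBM’s publicly documented ibm-watson Python SDK and its general-purpose Natural Language Understanding service (which is distinct from the Watson Health products discussed here); the API key, service URL, and sample text are placeholders.

```python
# Minimal sketch: calling a hosted Watson service through IBM's Python SDK.
# Assumes the ibm-watson package; the API key, URL, and text are placeholders.
from ibm_watson import NaturalLanguageUnderstandingV1
from ibm_watson.natural_language_understanding_v1 import Features, EntitiesOptions, KeywordsOptions
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

authenticator = IAMAuthenticator("YOUR_API_KEY")  # placeholder credential
nlu = NaturalLanguageUnderstandingV1(version="2022-04-07", authenticator=authenticator)
nlu.set_service_url("https://api.us-south.natural-language-understanding.watson.cloud.ibm.com")

# Extract entities and keywords from a short clinical-style note (synthetic text).
result = nlu.analyze(
    text="Patient reports persistent cough, fatigue, and a family history of lung cancer.",
    features=Features(entities=EntitiesOptions(limit=5), keywords=KeywordsOptions(limit=5)),
).get_result()

for kw in result.get("keywords", []):
    print(kw["text"], kw["relevance"])
```

An ecosystem application would wrap a call like this behind its own user interface and business logic, which is what the partner programs described above effectively do at much larger scale.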

Welltok is one of those examples. The company, which IBM has invested in, works directly with self-insured companies and insurers to find ways to engage customers so they choose healthier behaviors. Its CafeWell Concierge app uses Watson’s cognitive computing capabilities to provide customized insights to patients based on their personal profiles, helping them with things like finding a healthy restaurant nearby or answering detailed questions about their health.

Ultimately, these large and small partnerships are meant to give IBM’s Watson Health the breadth to help patients and doctors make better decisions. The goal is to eventually deliver more effective care, whether through quicker drug development, personalized health recommendations, or uncovering new genetically-linked treatments.

“These are huge benefits,” says Rob Merkel, vice president of IBM Watson’s health group. “This isn’t like a traditional business reengineering process where you’re cutting a few points of inefficiencies and saving money. You’re talking about fundamentally changing people’s lives.”

Transforming Big Blue

Watson is also intended to be an engine of transformation within IBM itself, a 104-year-old company that has been in the Fortune 500 for the past 21 years. The company has been honing its cognitive computing division across subject areas—from financial services to artificial intelligence to health care—and Watson’s role in Big Blue is becoming more vital.

“The vision for Watson Health is to serve as a catalyst to save and improve lives around the world and lower costs through cognitive computing,” says McGroddy. “Overall, from a business standpoint, we need to get this very big, very fast. It’s not just about transforming health care, it’s about the transformation of IBM, as well.”

Rometty has continually stressed the importance of cognitive computing at large, calling it “the dawn of a new era.” IBM launched the Watson Group in 2014, followed by the health care-dedicated division a year later.


Rometty has devoted more than $4 billion to build out Watson Health’s platform via acquisitions, indicating just how important this business is to the company. Most recently, Big Blue paid $2.6 billion for Truven Health Analytics, which will bring its total database to 300 million patients; the deal will also double IBM Watson’s size to nearly 5,000 employees.

However, in terms of Watson’s profit potential, IBM hasn’t said exactly how much it contributes to the company’s bottom line. Mike Rhodin, a senior vice president of the IBM Watson Group, told Fortune in September that Watson “is a big part of our growth in overall analytics which was $17 billion last year.”

There is no question that technology and big data have the potential to transform health care, in particular. One patient’s electronic health record holds an average of about 400 gigabytes worth of information. If you add in a patient’s genetic information, that ups a person’s total health data to 6 terabytes, says Merkel. Then, think about cross referencing an individual’s total data with existing medical literature or even thousands of other similar patients.

“It’s beyond human cognition to read that much information,” says Merkel. “We’re trying to provide insight across these two realms: knowledge and data. Ultimately, we want to impact people’s lives by providing new levels of insight.”



The Future of IBM’s Watson Health – A Case Study of Pioneering

  • September 23, 2019
  • Posted by: TreehillPartners
  • In: Leadership & Restructuring, Strategy & Corporate Development


IBM is known as one of the world’s computing juggernauts. So when IBM first announced its artificial intelligence (AI) question-answering project Watson (named after IBM’s first CEO, Thomas J. Watson) in 2010, there was a lot of interest in potential practical uses for the technology, specifically in the health sector. According to IBM, “The goal is to have computers start to interact in natural human terms across a range of applications and processes, understanding the questions that humans ask and providing answers that humans can understand and justify.”

In 2011, Watson wowed the public relations crowd by competing on the legendary U.S. television quiz show Jeopardy. Over three matches, IBM Watson won Jeopardy’s coveted US$1 million prize. IBM’s Christopher Padilla said of the match, “The technology behind Watson represents a major advancement in computing. In the data-intensive environment of government, this type of technology can help organizations make better decisions and improve how government helps its citizens.”

Since Jeopardy, Watson has had numerous other public relations wins, announcing several industry health partnerships.

Despite this seemingly strong start, nearly a decade has passed without any major practical Watson successes at scale. As the public noticed a growing gap between Watson’s promise of “ushering in system-wide healthcare innovation through the use of big data” and actual progress, the company decided to make leadership changes. Enter Paul Roma, appointed general manager of IBM Watson Health in late July 2019.

“Going forward, the word is focus,” says Roma. Reading between the lines, it seems that many of IBM Watson’s health projects were not as successful as anticipated. “We’re going to double down on what’s working, and we’re going to get super focused on execution… On a back-to-basics level, if we can execute on what’s in front of us with the clients that we have now, we’re going to be really successful,” Roma went on to say.

Perhaps IBM’s Watson does have a prosperous future ahead of it, given that healthcare still has all the unmet needs Watson promised to address years ago. Roma comes to the health GM role after a successful tenure at Ciox, an information technology company currently handling medical records data for over 3,000 hospitals across the United States. Before Ciox, Roma was the analytics chief of Deloitte’s global consultancy practice.

Where the industry in general, and Watson specifically, has lagged so far is in really understanding, and utilizing, the value of health data and the resulting possibilities to improve, at scale, the tools doctors use in the field and to yield the best solutions for patient care. Roma comments: “…it is my belief that we’re at a point in time, both in technology and medicine, where the technology needs to do better at helping doctors and patients interpret all this information at the right time, and give them a better way to approach their healthcare.” Others say we are still decades away from achieving this goal, and so far it has remained hard to prove them wrong.

“It is clear to me that the challenges in front of us require an integrated platform,” IBM’s Roma says. “You have to have a deep bench, not just in technology experience, but also in the ability to productize it and bring it to the market at scale.”

While pharmaceutical product development is a lengthy process of identifying and measuring patient benefit, innovating an entire industry through complex, integrated technology solutions that span many types of value-chain participants is rarely something that can be achieved single-handedly either.

We will see the success or failure of this pioneer program and continue working with clients to build strategies, partnerships and financings to take forward their own innovations. 



The Cole St John Blog

IBM Cloud Case Study

I chose this IBM case study because it provides a clear overview of how legacy systems can be modernized through cloud adoption. The consulting firm Deloitte partnered with IBM to improve the Medicaid Management Information Systems (MMIS). The collaboration resulted in an innovative solution for state healthcare agencies to align with CMS guidelines and adopt a more modular approach to enterprise IT (Deloitte Consulting | Ibm, 2024, p. 2). As the case study outlines, the MES platform was specifically designed to help state healthcare agencies transition away from the old cumbersome Medicaid Management Information Systems.

State healthcare agencies can reduce costs and save time by implementing the new MES platform. State agencies can update individual components as needed using the MES platform’s modular architecture, thus avoiding full-scale system overhauls (Deloitte Consulting | Ibm, 2024, p. 2). Moreover, the cloud-based deployment model allows agencies to rapidly scale resources while maintaining robust security controls (Deloitte Consulting | Ibm, 2024, p. 2). According to Brian Erdahl, Principal and Market Offering Leader for Deloitte, “The beauty of MES is that if an agency ends up not liking the functionality of a particular provider module, the organization can pull it and easily deploy a new module without disrupting what’s going on in claims processing or other modules that have been implemented,” (Deloitte Consulting | Ibm, 2024, p. 2). This flexibility enables state healthcare agencies to optimize their systems continuously without facing operational disruptions.
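
The case study does not publish the MES codebase, but the modularity Erdahl describes can be sketched in a few lines: each Medicaid function sits behind a common interface, so one module can be swapped out without touching the others. The module names, interface, and behavior below are hypothetical, purely to illustrate the design idea.

```python
# Hypothetical sketch of a modular MES-style design: each capability sits behind
# a common interface, so a provider module can be replaced without touching
# claims processing. Names and behavior are illustrative only.
from abc import ABC, abstractmethod

class MESModule(ABC):
    @abstractmethod
    def handle(self, request: dict) -> dict: ...

class ClaimsProcessingModule(MESModule):
    def handle(self, request: dict) -> dict:
        return {"status": "claim accepted", "claim_id": request.get("claim_id")}

class ProviderModuleV1(MESModule):
    def handle(self, request: dict) -> dict:
        return {"status": "provider enrolled (v1)", "npi": request.get("npi")}

class ProviderModuleV2(MESModule):
    def handle(self, request: dict) -> dict:
        return {"status": "provider enrolled (v2, new vendor)", "npi": request.get("npi")}

class MESPlatform:
    def __init__(self):
        self.modules: dict[str, MESModule] = {}

    def register(self, name: str, module: MESModule) -> None:
        self.modules[name] = module  # swapping a module is just re-registering it

    def route(self, name: str, request: dict) -> dict:
        return self.modules[name].handle(request)

platform = MESPlatform()
platform.register("claims", ClaimsProcessingModule())
platform.register("provider", ProviderModuleV1())
print(platform.route("provider", {"npi": "1234567890"}))

# The agency dislikes the provider module: pull it and deploy a new one.
platform.register("provider", ProviderModuleV2())
print(platform.route("provider", {"npi": "1234567890"}))  # claims module untouched
```

The point is the interface, not the implementation: as long as the replacement module honors the same contract, the rest of the platform never notices the swap.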

Despite the numerous benefits offered by cloud computing, companies must also be mindful of its challenges and drawbacks. One concern is the reliance on internet connectivity, which is vital for the seamless operation of cloud-based systems. Knowing that a simple internet outage could lead to productivity losses, customers must ensure they have a backup plan in place. Another primary concern of implementing cloud-based technology is security. This means IBM must uphold stringent cybersecurity standards to protect the data of its customers.

I believe Deloitte made a great decision to partner with IBM to develop a cloud-based solution for the Medicaid Management Information System. By utilizing cloud computing, state healthcare agencies can achieve significant cost savings and operational efficiencies. Despite the risks associated with cloud computing, “the VMware Cloud Foundation on the IBM Cloud platform offers firewall services, virtual SAN components, and other virtual resources for the rapid deployment of the MES solution” (Deloitte Consulting | Ibm, 2024, p. 2). Hence, the benefits of the IBM Cloud solution, along with the combined expertise of Deloitte and IBM, outweigh the potential challenges, making it a strategic choice for modernizing Medicaid Management Information Systems.

Deloitte consulting | ibm. (2024, February 1). https://www.ibm.com/case-studies/deloitte-consulting



Case Study: IBM - i3ARCHIVE

  • Deployment country: Pennsylvania, USA
  • Industry: Healthcare, Life Sciences
  • Solution areas: Business-to-Business, Business-to-Consumer, Business Performance Transformation, Business Resiliency, Content Management, Data Warehouse, Database Management, Digital Media, Grid Computing, Innovation that matters
  • Challenge: i3ARCHIVE needed to give healthcare providers a means of managing and transporting digital medical images that fit seamlessly into their operations.
  • Business need: Rising costs require healthcare providers to seek flexible, cost-effective solutions.
  • Solution: Using grid technology, i3ARCHIVE created a system that enables providers and researchers to "plug in" to its base of two million images for more accurate testing and diagnosis.
  • Key benefit: Fifty percent reduction in a hospital's overall medical imaging transport costs, resulting in savings of up to US$1 million annually for larger hospitals.
"Hospitals are yearning for more on demand services so they can focus less on IT and more on healthcare ... With IBM's help and technology, we're giving them the services and information they need to change the way they provide care."-- Derek Danois, CEO, i3ARCHIVE.

On Demand Business defined

An enterprise whose business processes—integrated end-to-end across the company and with key partners, suppliers and customers—can respond with speed to any customer demand, market opportunity or external threat.


In recent years, the healthcare industry has witnessed a dramatic growth in its information technology investments. Driven by necessity and enabled by innovative new technologies and approaches, healthcare institutions are increasingly looking to IT as the cornerstone of a new way of delivering healthcare services. Clinicians and researchers see powerful processing resources combined with sophisticated analytical tools as a way to uncover better ways of diagnosing and treating illnesses.

This infusion of IT into clinical activities has already produced a surge of dramatic results and even higher expectations for the future. Put simply, the tools are within reach for clinicians and researchers to find and understand the underlying patterns of disease like they never could before.

On Demand Business Benefits

  • Fifty percent reduction in a hospital’s overall medical imaging transport costs, resulting in savings of up to US$1 million annually for larger hospitals
  • Reduction in time to deliver medical files to physicians’ offices from days to minutes
  • Improved diagnostic accuracy
  • Faster FDA approval of imaging devices by leveraging NDMA data
  • Ability of patients to access personal health information on demand and the inherent cost savings associated with self-management
  • More efficient allocation of investment resources by hospitals through the avoidance of IT expenditures and support costs

Learning to share

Despite their widening embrace of technology, healthcare institutions, perhaps more than ever, don't want to be in the information technology business. Even with a clear need for advanced IT capabilities, hospitals and clinics have been compelled by rapidly rising healthcare costs to focus their resources on their core mission. As a result, hospital executives are under growing pressure to avoid investments in costly IT infrastructure and the staff required to support it.

However, this presents a paradox and a challenge for healthcare providers seeking to achieve the vision of information-based medicine outlined above. In the effort to glean insights from patterns within clinical data, the biggest payoff naturally comes from aggregating the same type of data (e.g., test results) from different sources (e.g., hospitals). With clinicians and researchers trying to discern the subtle patterns within the "big picture," the more data points -- or "pixels" -- they can add to the picture, the better their chances of finding these patterns. Medical imaging illustrates this well. While many hospitals are embracing digital imaging technology (including the PACS, or picture archiving and communication systems, that enable them to share imaging files within a given hospital), most hospitals lack the ability to share their medical imaging data with other hospitals. This means the power of pooled data goes largely untapped. With few if any hospitals ready to face the technical and economic challenges that bridging this gulf entailed, innovation was needed to fundamentally change the equation for hospitals. That's just what i3ARCHIVE did.

“What we’re doing fits into a trend of hospitals wanting to get out of the IT business. If you talk to hospital CFOs and CEOs, they will tell you the thing they don’t want to be doing is spending lots of money to build up their own IT infrastructures.” – Derek Danois

Based in Berwyn, Pa., i3ARCHIVE originated as a federally funded research project led by the University of Pennsylvania whose goal was to create an open, secure and nationwide repository for digital medical images and data designed to encourage collaboration among hospitals, clinics and researchers. To succeed, i3 had to meet stiff challenges on two levels. First, it needed to create a powerful, flexible and resilient infrastructure capable of handling enormous transaction volumes as large numbers of healthcare facilities across the U.S. added and accessed large digital medical image files to and from the system. An even greater challenge was to deliver this industrial-strength processing capability in a way that was available on demand, completely seamless to the clinicians and researchers using it, and imposed no additional infrastructure or support burdens on facilities that adopted it. The company was able to create an innovative shared-services infrastructure known as the National Digital Medical Archive (NDMA) that puts the power, as well as the management requirement, in the backend. For the front end of the solution, i3 created an interface it calls the WallPlug™ that enables healthcare facilities to procure NDMA-based services by essentially plugging into the enormous, real-time archive running invisibly in the background. The NDMA currently houses over two million patient images and is steadily growing every day.

Plugging into information power

For users, one of the biggest values of i3's approach is that it offers a fundamental change in the way they access leading-edge technology. Most hospitals would agree that they have enough servers, workstations, monitors and software to support and, if given the choice, would prefer to invest in things that generate revenue -- such as a new kind of testing device. i3's offering does this by making the NDMA a seamlessly connected service in a hospital's existing clinical information network. The key is its support for DICOM (Digital Imaging and Communications in Medicine), an industry standard for networked imaging devices in the healthcare industry. Accessing the NDMA's services like a utility, clinicians can either upload their patients' medical images such as X-rays, MRIs and CT scans into the NDMA for storage, or download select files of patients from around the country in order to perform comparisons and make a more accurate diagnosis. Requests are handled by a highly flexible grid-based infrastructure that dynamically shifts processing to any of 64 nodes distributed across three strategically placed hosting locations. For security, a pair of servers within each hospital acts as a buffer between the hospital's internal systems and the backend grid; encrypted data is passed across this link. To enable the service's key value proposition -- that it imposes no support burden on hospital staff or operations -- i3 relied on IBM eServer xSeries servers based on their powerful remote monitoring and management capability. xSeries servers, running IBM DB2 Universal Database Extended Enterprise Edition to perform high-volume parallel processing and DB2 Content Manager to manage the solution's content, also constitute the core of the grid.
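
i3's WallPlug interface itself is not public, but the workflow described here — a hospital system pushing an image to a remote archive over DICOM — corresponds to a standard C-STORE operation. The sketch below uses the open-source pydicom and pynetdicom libraries; the archive hostname, port, AE title, and file name are placeholders and are not i3's actual endpoints.

```python
# Illustrative only: a standard DICOM C-STORE to a remote archive, roughly the
# operation a PACS performs when pushing an image to a shared service like the
# NDMA. Hostname, port, and file are placeholders, not i3's real interface.
from pydicom import dcmread
from pynetdicom import AE, StoragePresentationContexts

ae = AE(ae_title="HOSPITAL_GW")  # the hospital-side gateway
ae.requested_contexts = StoragePresentationContexts

assoc = ae.associate("archive.example.org", 104)  # placeholder archive endpoint
if assoc.is_established:
    ds = dcmread("chest_ct_slice.dcm")   # placeholder image file
    status = assoc.send_c_store(ds)      # upload the study to the archive
    print("C-STORE status:", status.Status if status else "no response")
    assoc.release()
else:
    print("Association with the archive failed")
```

Because the exchange rides on the DICOM standard, the archive looks to the hospital's existing imaging devices like just another networked destination, which is the essence of the "no added support burden" value proposition described above.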

Key Components

  • IBM DB2® Universal Database™ Extended Enterprise Edition
  • IBM DB2 Content Manager
  • IBM eServer™ xSeries® servers
  • IBM EXP300 STOR Expansion Units
  • IBM Global Services e-business Hosting™
  • Upfront development: one year
  • Expansion/deployment: ongoing

A study in leverage

Having set out to develop a flexible and versatile solution, i3 has exceeded its own initial expectations. One true hallmark of a solution's flexibility is the ability to leverage it -- to use it as a cost-effective base for adding new services or capabilities. Since i3 launched its core NDMA service, examples of such ROI-enhancing leverage have become abundant. A good example is the company's Storage, Disaster Recovery and Business Continuity offerings, which directly leverage the archive and require no added investments in storage media by hospitals. By creating a mirrored image of their PACS system within the archive, hospitals can use the NDMA for real-time failover if their internal systems go down.

Another example of "pure" leverage is i3's recently introduced communications services. Because the NDMA is an open platform, multiple hospitals can plug into it and use it as a conduit to send digital medical image files to and from their PACS systems. Through this service, one hospital expects to save US$1 million annually in processing and courier costs. But perhaps the greatest expression of the NDMA's inherent versatility is its ability to support an entirely new business model. In late 2005, i3 created a standalone business unit known as MyNDMA.com, which gives patients the ability to manage their own digital health records. The portal-based service also enables physicians to access their patients' records electronically, thus sparing patients the burden of physically retrieving and transporting their files to a new doctor in the event of a referral, second opinion or change in carrier. In addition to the obvious improvement in clinical efficiency, myNDMA gives patients an unprecedented degree of accessibility to their medical records and images, as well as the truly groundbreaking ability to participate in the management and security of their personal health records. Like all of its services, MyNDMA was created to meet the concrete demands of hospitals, physicians and patients. The openness and flexibility of the underlying archive enabled i3 to meet these needs rapidly and cost-effectively.

Why it matters

As a rule, clinicians can do a better job of diagnosing, tracking and treating illness when they have more clinical information at their disposal. However, constraints in both IT budgets and support staff have limited their ability to make the investments necessary to gain easier access to it. i3's innovation was to create a shared, on demand service that plugged directly into a hospital's existing medical information networks, thus shielding it from the complexity, costs and support requirements of a new IT infrastructure.

Finally, there's the issue of scale. With medical imaging data spread across unconnected islands, seeing the important patterns that can improve care is effectively impossible. In one of its earliest breakthrough applications, i3 was able to help medical imaging equipment vendors detect subtle gradations in the performance of certain devices over time such as the effect of heat and varying radiation output on image quality -- all with existing, real-time archive data. This gave vendors a solid basis to change the way their machines are configured to give more accurate results.

Overall, CEO Derek Danois sees i3's NDMA as a catalyst to transformation across the healthcare industry. "Hospitals are yearning for more on demand services so they can focus less on IT and more on healthcare, which requires a more efficient IT infrastructure. The NDMA resolves this paradox by delivering state-of-the-art information management services to its customers, built on the reliability and power of IBM technology."

For more information: Please contact your IBM sales representative or IBM Business Partner. Visit us at: ibm.com/ondemand

Products and Services Used

IBM products and services that were used in this case study.

  • xSeries Servers
  • DB2 Universal Database Enterprise Server Edition, DB2 Content Manager
  • IBM e-business Hosting, IBM Global Services



  • Survey paper
  • Open access
  • Published: 19 June 2019

Big data in healthcare: management, analysis and future prospects

  • Sabyasachi Dash,
  • Sushil Kumar Shakyawar,
  • Mohit Sharma &
  • Sandeep Kaushik

Journal of Big Data, volume 6, Article number: 54 (2019)


‘Big data’ refers to massive amounts of information that can work wonders. It has become a topic of special interest for the past two decades because of the great potential hidden in it. Various public and private sector industries generate, store, and analyze big data with an aim to improve the services they provide. In the healthcare industry, sources of big data include hospital records, medical records of patients, results of medical examinations, and devices that are part of the internet of things. Biomedical research also generates a significant portion of big data relevant to public healthcare. This data requires proper management and analysis in order to derive meaningful information. Otherwise, seeking a solution by analyzing big data quickly becomes comparable to finding a needle in a haystack. There are various challenges associated with each step of handling big data, which can only be overcome by using high-end computing solutions for big data analysis. That is why, to provide relevant solutions for improving public health, healthcare providers need to be fully equipped with appropriate infrastructure to systematically generate and analyze big data. Efficient management, analysis, and interpretation of big data can change the game by opening new avenues for modern healthcare. That is exactly why various industries, including the healthcare industry, are taking vigorous steps to convert this potential into better services and financial advantages. With a strong integration of biomedical and healthcare data, modern healthcare organizations can potentially revolutionize medical therapies and personalized medicine.

Introduction

Information has been the key to better organization and new developments. The more information we have, the more optimally we can organize ourselves to deliver the best outcomes. That is why data collection is an important part of every organization. We can also use this data to predict current trends in certain parameters and future events. As we become more aware of this, we have started producing and collecting more data about almost everything by introducing technological developments in this direction. Today, we face a situation wherein we are flooded with tons of data from every aspect of our lives, such as social activities, science, work, and health. In a way, we can compare the present situation to a data deluge. Technological advances have helped us generate more and more data, even to a level where it has become unmanageable with currently available technologies. This has led to the creation of the term ‘big data’ to describe data that is large and unmanageable. In order to meet our present and future social needs, we need to develop new strategies to organize this data and derive meaningful information. One such special social need is healthcare. Like every other industry, healthcare organizations are producing data at a tremendous rate, which presents many advantages and challenges at the same time. In this review, we discuss the basics of big data including its management, analysis and future prospects, especially in the healthcare sector.

The data overload

Every day, people working with various organizations around the world are generating a massive amount of data. The term “digital universe” quantitatively describes such massive amounts of data created, replicated, and consumed in a single year. International Data Corporation (IDC) estimated the approximate size of the digital universe in 2005 to be 130 exabytes (EB). The digital universe in 2017 expanded to about 16,000 EB, or 16 zettabytes (ZB). IDC predicted that the digital universe would expand to 40,000 EB by the year 2020. To imagine this size, we would have to assign about 5200 gigabytes (GB) of data to every individual on Earth. This exemplifies the phenomenal speed at which the digital universe is expanding. Internet giants like Google and Facebook have been collecting and storing massive amounts of data. For instance, depending on our preferences, Google may store a variety of information including user location, advertisement preferences, list of applications used, internet browsing history, contacts, bookmarks, emails, and other necessary information associated with the user. Similarly, Facebook stores and analyzes more than about 30 petabytes (PB) of user-generated data. Such large amounts of data constitute ‘big data’. Over the past decade, big data has been successfully used by the IT industry to generate critical information that can generate significant revenue.
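
A quick back-of-the-envelope check of the per-person figure, assuming a world population of roughly 7.7 billion at the time of the projection (the population figure is our assumption, not IDC's):

```python
# Rough sanity check of the "about 5,200 GB per person" figure.
# Assumptions: 40,000 EB projected for 2020, ~7.7 billion people.
total_bytes = 40_000 * 10**18        # 40,000 exabytes
population = 7.7e9
per_person_gb = total_bytes / population / 10**9
print(f"~{per_person_gb:,.0f} GB per person")   # prints roughly 5,195 GB
```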

These observations have become so conspicuous that they have eventually led to the birth of a new field of science termed ‘data science’. Data science deals with various aspects including data management and analysis, to extract deeper insights for improving the functionality or services of a system (for example, a healthcare or transport system). Additionally, with the availability of some of the most creative and meaningful ways to visualize big data post-analysis, it has become easier to understand the functioning of any complex system. As a large section of society becomes aware of, and involved in generating, big data, it has become necessary to define what big data is. Therefore, in this review, we attempt to provide details on the impact of big data in the transformation of the global healthcare sector and its impact on our daily lives.

Defining big data

As the name suggests, ‘big data’ represents large amounts of data that are unmanageable using traditional software or internet-based platforms. It surpasses the traditionally used amounts of storage, processing and analytical power. Even though a number of definitions for big data exist, the most popular and well-accepted definition was given by Douglas Laney. Laney observed that (big) data was growing in three different dimensions, namely volume, velocity and variety (known as the 3 Vs) [1]. The ‘big’ part of big data is indicative of its large volume. In addition to volume, the big data description also includes velocity and variety. Velocity indicates the speed or rate of data collection and making it accessible for further analysis, while variety remarks on the different types of organized and unorganized data that any firm or system can collect, such as transaction-level data, video, audio, text or log files. These three Vs have become the standard definition of big data. Although other people have added several other Vs to this definition [2], the most accepted fourth V remains ‘veracity’.

The term “big data” has become extremely popular across the globe in recent years. Almost every sector of research, whether it relates to industry or academia, is generating and analyzing big data for various purposes. The most challenging task regarding this huge heap of data, which can be organized or unorganized, is its management. Given the fact that big data is unmanageable using traditional software, we need technically advanced applications and software that can utilize fast and cost-efficient high-end computational power for such tasks. Implementation of artificial intelligence (AI) algorithms and novel fusion algorithms would be necessary to make sense of this large amount of data. Indeed, it would be a great feat to achieve automated decision-making by the implementation of machine learning (ML) methods like neural networks and other AI techniques. However, in the absence of appropriate software and hardware support, big data can be quite hazy. We need to develop better techniques to handle this ‘endless sea’ of data and smart web applications for efficient analysis to gain workable insights. With proper storage and analytical tools in hand, the information and insights derived from big data can make critical social infrastructure components and services (like healthcare, safety or transportation) more aware, interactive and efficient [3]. In addition, visualization of big data in a user-friendly manner will be a critical factor for societal development.

Healthcare as a big-data repository

Healthcare is a multi-dimensional system established with the sole aim of the prevention, diagnosis, and treatment of health-related issues or impairments in human beings. The major components of a healthcare system are the health professionals (physicians or nurses), health facilities (clinics and hospitals for delivering medicines and other diagnosis or treatment technologies), and a financing institution supporting the former two. The health professionals belong to various health sectors like dentistry, medicine, midwifery, nursing, psychology, physiotherapy, and many others. Healthcare is required at several levels depending on the urgency of the situation. It is provided as the first point of consultation (primary care), acute care requiring skilled professionals (secondary care), advanced medical investigation and treatment (tertiary care), and highly uncommon diagnostic or surgical procedures (quaternary care). At all these levels, the health professionals are responsible for different kinds of information such as the patient’s medical history (diagnosis- and prescription-related data), medical and clinical data (like data from imaging and laboratory examinations), and other private or personal medical data. Previously, the common practice was to store such medical records for a patient in the form of either handwritten notes or typed reports [4]. Even the results from a medical examination were stored in a paper file system. In fact, this practice is really old, with the oldest case reports existing on a papyrus text from Egypt that dates back to 1600 BC [5]. In Stanley Reiser’s words, the clinical case records “freeze the episode of illness as a story in which patient, family and the doctor are a part of the plot” [6].

With the advent of computer systems and their potential, the digitization of all clinical exams and medical records in healthcare systems has become a standard and widely adopted practice nowadays. In 2003, a division of the National Academies of Sciences, Engineering, and Medicine known as the Institute of Medicine chose the term “electronic health records” to represent records maintained for improving the health care sector towards the benefit of patients and clinicians. Electronic health records (EHR), as defined by Murphy, Hanken and Waters, are computerized medical records for patients: “any information relating to the past, present or future physical/mental health or condition of an individual which resides in electronic system(s) used to capture, transmit, receive, store, retrieve, link and manipulate multimedia data for the primary purpose of providing healthcare and health-related services” [7].

Electronic health records

It is important to note that the National Institutes of Health (NIH) recently announced the “All of Us” initiative ( https://allofus.nih.gov/ ) that aims to collect one million or more patients’ data, such as EHR, including medical imaging, socio-behavioral, and environmental data, over the next few years. EHRs have introduced many advantages for handling modern healthcare-related data. Below, we describe some of the characteristic advantages of using EHRs. The first advantage of EHRs is that healthcare professionals have improved access to the entire medical history of a patient. The information includes medical diagnoses, prescriptions, data related to known allergies, demographics, clinical narratives, and the results obtained from various laboratory tests. The recognition and treatment of medical conditions is thus time-efficient due to a reduction in the lag time of previous test results. With time, we have observed a significant decrease in redundant and additional examinations, lost orders and ambiguities caused by illegible handwriting, and an improved care coordination between multiple healthcare providers. Overcoming such logistical errors has led to a reduction in the number of drug allergies by reducing errors in medication dose and frequency. Healthcare professionals have also found that access over web-based and electronic platforms improves their medical practices significantly through automatic reminders and prompts regarding vaccinations, abnormal laboratory results, cancer screening, and other periodic checkups. There is greater continuity of care and timely interventions by facilitating communication among multiple healthcare providers and patients. EHRs can also be associated with electronic authorization and immediate insurance approvals due to less paperwork. EHRs enable faster data retrieval and facilitate reporting of key healthcare quality indicators to the organizations, and also improve public health surveillance by immediate reporting of disease outbreaks. EHRs also provide relevant data regarding the quality of care for the beneficiaries of employee health insurance programs and can help control the increasing costs of health insurance benefits. Finally, EHRs can reduce or absolutely eliminate delays and confusion in the billing and claims management area. Together, the EHRs and the internet help provide access to millions of health-related records critical for patient care.

Digitization of healthcare and big data

Similar to the EHR, an electronic medical record (EMR) stores the standard medical and clinical data gathered from patients. EHRs, EMRs, personal health records (PHR), medical practice management software (MPM), and many other healthcare data components collectively have the potential to improve the quality, service efficiency, and costs of healthcare along with the reduction of medical errors. Big data in healthcare includes the healthcare payer-provider data (such as EMRs, pharmacy prescriptions, and insurance records) along with the genomics-driven experiments (such as genotyping and gene expression data) and other data acquired from the smart web of the internet of things (IoT) (Fig. 1). The adoption of EHRs was slow at the beginning of the 21st century; however, it has grown substantially since 2009 [7, 8]. The management and usage of such healthcare data has been increasingly dependent on information technology. The development and usage of wellness monitoring devices and related software that can generate alerts and share the health-related data of a patient with the respective health care providers have gained momentum, especially in establishing a real-time biomedical and health monitoring system. These devices are generating a huge amount of data that can be analyzed to provide real-time clinical or medical care [9]. The use of big data from healthcare shows promise for improving health outcomes and controlling costs.

Fig. 1 Workflow of big data analytics. Data warehouses store massive amounts of data generated from various sources. This data is processed using analytic pipelines to obtain smarter and more affordable healthcare options.
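
As a concrete, deliberately simplified illustration of such a pipeline, the sketch below aggregates a few synthetic EHR-style rows into a per-patient summary with pandas; the column names, values, and flagging thresholds are invented for the example.

```python
# Toy version of the warehouse-to-insight pipeline in Fig. 1: synthetic
# EHR-style rows are aggregated into a per-patient summary. All data invented.
import pandas as pd

records = pd.DataFrame({
    "patient_id": ["P001", "P001", "P002", "P002", "P003"],
    "visit_date": pd.to_datetime(["2019-01-05", "2019-03-20", "2019-02-11",
                                  "2019-04-02", "2019-01-30"]),
    "systolic_bp": [138, 142, 121, 119, 160],
    "hba1c":       [6.1,  6.4,  5.2,  5.3,  7.9],
})

summary = (records
           .groupby("patient_id")
           .agg(visits=("visit_date", "count"),
                mean_systolic=("systolic_bp", "mean"),
                latest_hba1c=("hba1c", "last"))
           .reset_index())

# Flag patients whose averages suggest follow-up (thresholds are illustrative).
summary["needs_follow_up"] = (summary["mean_systolic"] > 130) | (summary["latest_hba1c"] > 6.5)
print(summary)
```

A production pipeline would of course add data cleaning, terminology mapping, and far richer models, but the shape — raw records in, patient-level insight out — is the same.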

Big data in biomedical research

A biological system, such as a human cell, exhibits molecular and physical events of complex interplay. In order to understand the interdependencies of the various components and events of such a complex system, a biomedical or biological experiment usually gathers data on a smaller and/or simpler component. Consequently, it requires multiple simplified experiments to generate a wide map of a given biological phenomenon of interest. This indicates that the more data we have, the better we understand the biological processes. With this idea, modern techniques have evolved at a great pace. For instance, one can imagine the amount of data generated since the integration of efficient technologies like next-generation sequencing (NGS) and genome-wide association studies (GWAS) to decode human genetics. NGS-based data provides information at depths that were previously inaccessible and takes the experimental scenario to a completely new dimension. It has increased the resolution at which we observe or record biological events associated with specific diseases in a real-time manner. The idea that large amounts of data can provide us a good amount of information that often remains unidentified or hidden in smaller experimental methods has ushered in the ‘-omics’ era. The ‘omics’ discipline has witnessed significant progress: instead of studying a single ‘gene’, scientists can now study the whole ‘genome’ of an organism in ‘genomics’ studies within a given amount of time. Similarly, instead of studying the expression or ‘transcription’ of a single gene, we can now study the expression of all the genes, or the entire ‘transcriptome’, of an organism under ‘transcriptomics’ studies. Each of these individual experiments generates a large amount of data with more depth of information than ever before. Yet, this depth and resolution might be insufficient to provide all the details required to explain a particular mechanism or event. Therefore, one usually finds oneself analyzing a large amount of data obtained from multiple experiments to gain novel insights. This fact is supported by a continuous rise in the number of publications regarding big data in healthcare (Fig. 2). Analysis of such big data from medical and healthcare systems can be of immense help in providing novel strategies for healthcare. The latest technological developments in data generation, collection and analysis have raised expectations towards a revolution in the field of personalized medicine in the near future.

Fig. 2 Publications associated with big data in healthcare. The numbers of publications in PubMed are plotted by year.

Big data from omics studies

NGS has greatly simplified sequencing and decreased the costs of generating whole genome sequence data. The cost of complete genome sequencing has fallen from millions to a couple of thousand dollars [10]. NGS technology has resulted in an increased volume of biomedical data that comes from genomic and transcriptomic studies. According to an estimate, the number of human genomes sequenced by 2025 could be between 100 million and 2 billion [11]. Combining the genomic and transcriptomic data with proteomic and metabolomic data can greatly enhance our knowledge about the individual profile of a patient—an approach often described as “individual, personalized or precision health care”. Systematic and integrative analysis of omics data in conjunction with healthcare analytics can help design better treatment strategies towards precision and personalized medicine (Fig. 3). The genomics-driven experiments, e.g., genotyping, gene expression, and NGS-based studies, are the major source of big data in biomedical healthcare along with EMRs, pharmacy prescription information, and insurance records. Healthcare requires a strong integration of such biomedical data from various sources to provide better treatments and patient care. These prospects are so exciting that, even though genomic data from patients would have many variables to be accounted for, commercial organizations are already using human genome data to help providers make personalized medical decisions. This might turn out to be a game-changer in future medicine and health.
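
A minimal sketch of what "integrating omics data with healthcare analytics" can mean in practice: joining a genomic variant table with a clinical table on a patient identifier so both views feed one analysis. The tables, column names, and values below are entirely synthetic.

```python
# Joining synthetic genomic and clinical tables on patient_id, a minimal form
# of the omics + healthcare integration discussed above. All values invented.
import pandas as pd

genomics = pd.DataFrame({
    "patient_id": ["P001", "P002", "P003"],
    "egfr_mutation": [True, False, True],      # illustrative variant flag
})
clinical = pd.DataFrame({
    "patient_id": ["P001", "P002", "P003"],
    "diagnosis": ["lung adenocarcinoma", "lung adenocarcinoma", "NSCLC"],
    "age": [63, 57, 71],
})

merged = clinical.merge(genomics, on="patient_id", how="left")

# A treatment-planning query becomes a simple filter over the merged view.
candidates = merged[merged["egfr_mutation"] & (merged["age"] < 70)]
print(candidates[["patient_id", "diagnosis"]])
```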

Fig. 3 A framework for integrating omics data and health care analytics to promote personalized treatment.

Internet of Things (IOT)

The healthcare industry has not been quick enough to adapt to the big data movement compared to other industries. Therefore, big data usage in the healthcare sector is still in its infancy. For example, healthcare and biomedical big data have not yet converged to enhance healthcare data with molecular pathology. Such convergence can help unravel various mechanisms of action or other aspects of predictive biology. Therefore, to assess an individual’s health status, biomolecular and clinical datasets need to be married. One such source of clinical data in healthcare is the ‘internet of things’ (IoT).

In fact, IoT is another big player implemented in a number of other industries including healthcare. Until recently, objects of common use such as cars, watches, refrigerators and health-monitoring devices did not usually produce or handle data and lacked internet connectivity. However, furnishing such objects with computer chips and sensors that enable data collection and transmission over the internet has opened new avenues. Device technologies such as radio-frequency identification (RFID) tags and readers, and near-field communication (NFC) devices, which can not only gather information but also interact physically, are increasingly being used as information and communication systems [3]. This enables objects with RFID or NFC to communicate and function as a web of smart things. The analysis of data collected from these chips or sensors may reveal critical information that might be beneficial in improving lifestyle, establishing measures for energy conservation, improving transportation, and healthcare. In fact, IoT has become a rising movement in the field of healthcare. IoT devices create a continuous stream of data while monitoring the health of people (or patients), which makes these devices a major contributor to big data in healthcare. Such resources can interconnect various devices to provide a reliable, effective and smart healthcare service to the elderly and patients with a chronic illness [12].

Advantages of IoT in healthcare

Using the web of IoT devices, a doctor can measure and monitor various parameters from his or her clients in their respective locations, for example, at home or in the office. Therefore, through early intervention and treatment, a patient might not need hospitalization, or even a visit to the doctor, resulting in significant reductions in healthcare expenses. Some examples of IoT devices used in healthcare include fitness or health-tracking wearable devices, biosensors, clinical devices for monitoring vital signs, and other types of devices or clinical instruments. Such IoT devices generate a large amount of health-related data. If we can integrate this data with other existing healthcare data like EMRs or PHRs, we can predict a patient’s health status and its progression from subclinical to pathological state [9]. In fact, big data generated from IoT has been quite advantageous in several areas in offering better investigation and predictions. On a larger scale, the data from such devices can help in personal health monitoring, modelling the spread of a disease, and finding ways to contain a particular disease outbreak.
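
The kind of early-warning logic described here can be sketched very simply: a stream of readings from a wearable device is checked against a threshold, and an alert is raised before the situation becomes acute. The device readings and the threshold below are synthetic, chosen only to illustrate the pattern.

```python
# Minimal sketch of threshold-based monitoring over a stream of IoT readings.
# The glucose values and alert threshold are synthetic, for illustration only.
from typing import Iterable

LOW_GLUCOSE_MG_DL = 70          # illustrative hypoglycemia threshold

def monitor_glucose(readings: Iterable[tuple[str, float]]) -> list[str]:
    """Return alert messages for readings that cross the low-glucose threshold."""
    alerts = []
    for timestamp, mg_dl in readings:
        if mg_dl < LOW_GLUCOSE_MG_DL:
            alerts.append(f"{timestamp}: glucose {mg_dl} mg/dL below threshold, notify care team")
    return alerts

# Simulated feed from a continuous glucose monitor.
feed = [("08:00", 95.0), ("08:15", 82.0), ("08:30", 68.0), ("08:45", 61.0)]
for alert in monitor_glucose(feed):
    print(alert)
```

Real systems replace the fixed threshold with predictive models trained on the patient's history, but the flow — continuous readings in, timely alerts out — is the same.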

The analysis of data from IoT would require updated operating software, because of its specific nature, along with advanced hardware and software applications. We would need to manage data inflow from IoT instruments in real time and analyze it by the minute. Associates in the healthcare system are trying to trim down costs and improve the quality of care by applying advanced analytics to both internally and externally generated data.

Mobile computing and mobile health (mHealth)

In today’s digital world, many individuals track their fitness and health statistics using the built-in pedometers and sensors of portable and wearable devices such as smartphones, smartwatches, fitness trackers and tablets. With an increasingly mobile society in almost all aspects of life, the healthcare infrastructure needs remodeling to accommodate mobile devices [ 13 ]. The practice of medicine and public health using mobile devices, known as mHealth or mobile health, pervades many areas of healthcare, especially the management of chronic diseases such as diabetes and cancer [ 14 ]. Healthcare organizations are increasingly using mobile health and wellness services to implement novel and innovative ways of providing care and coordinating health and wellness. Mobile platforms can improve healthcare by accelerating interactive communication between patients and healthcare providers. Apple and Google, for instance, have developed dedicated platforms, Apple’s ResearchKit and Google Fit, for building research applications around fitness and health statistics [ 15 ]. These applications support seamless interaction with various consumer devices and embedded sensors for data integration, giving both patients and their doctors direct, near real-time access to overall health data. Such apps and smart devices also improve wellness planning and encourage healthy lifestyles, and users or patients can become advocates for their own health.

Nature of the big data in healthcare

EHRs can enable advanced analytics and support clinical decision-making by providing enormous amounts of data. However, a large proportion of this data is currently unstructured. Unstructured data is information that does not adhere to a pre-defined model or organizational framework. One reason for this is simply that clinical information can be recorded in a myriad of formats. Another reason for opting for an unstructured format is that structured input options (drop-down menus, radio buttons, and check boxes) often fall short when capturing data of a complex nature. For example, non-standard information regarding a patient’s clinical suspicions, socioeconomic circumstances, preferences, key lifestyle factors, and other related details can hardly be recorded in any way but an unstructured format. It is difficult to group such varied, yet critical, sources of information into an intuitive or unified data format for further analysis by algorithms that aim to understand and support patient care. Nonetheless, the healthcare industry needs to utilize the full potential of these rich streams of information to enhance the patient experience; in the healthcare sector, this could materialize as better management, better care and lower-cost treatments. We are still far from realizing the benefits of big data in a meaningful way and harnessing the insights that come from it. To achieve these goals, big data needs to be managed and analyzed in a systematic manner.

Management and analysis of big data

Big data consists of huge amounts of varied data generated at a rapid rate; the data gathered from various sources is mostly needed for optimizing services rather than for direct consumption. This is also true of big data from biomedical research and healthcare. The major challenge with big data is how to handle this large volume of information. To make it available to the scientific community, the data must be stored in a file format that is easily accessible and readable, so that it can be analyzed efficiently. In the context of healthcare data, another major challenge is implementing high-end computing tools, protocols and hardware in the clinical setting. Experts from diverse backgrounds, including biology, information technology, statistics, and mathematics, are required to work together to achieve this goal. Data collected using sensors can be made available on a storage cloud with pre-installed software tools developed by analytics tool developers. These tools have data mining and ML functions, developed by AI experts, to convert the information stored as data into knowledge. Upon implementation, they enhance the efficiency of acquiring, storing, analyzing, and visualizing big data from healthcare. The main task is to annotate, integrate, and present this complex data in an appropriate manner for better understanding; in the absence of such curation, the healthcare data remains opaque and may not lead biomedical researchers any further. Finally, visualization tools developed by computer graphics designers can efficiently display this newly gained knowledge.

Heterogeneity of data is another challenge in big data analysis. The huge size and highly heterogeneous nature of big data in healthcare render it relatively uninformative when only conventional technologies are used. The most common platforms for operating the software frameworks that assist big data analysis are high-performance computing clusters accessed via grid computing infrastructures. Cloud computing is one such system: it provides virtualized storage technologies and reliable services, offering high reliability, scalability and autonomy along with ubiquitous access, dynamic resource discovery and composability. Such platforms can act as receivers of data from ubiquitous sensors, as computing resources to analyze and interpret the data, and as providers of easy-to-understand, web-based visualization for the user. In IoT settings, big data processing and analytics can also be performed closer to the data source using mobile edge computing cloudlets and fog computing. Advanced algorithms are required to implement ML and AI approaches for big data analysis on computing clusters, and a programming language suited to working with big data (e.g. Python, R or other languages) is used to write such algorithms or software. A good knowledge of both biology and IT is therefore required to handle big data from biomedical research, a combination that usually fits bioinformaticians. The most common platforms for working with big data include Hadoop and Apache Spark, which we briefly introduce below.

Loading large amounts of (big) data into the memory of even the most powerful computing clusters is not an efficient way to work with big data. The best logical approach for analyzing huge volumes of complex big data is therefore to distribute the data and process it in parallel on multiple nodes. However, the data is usually so large that thousands of computing machines are required to distribute it and finish processing in a reasonable amount of time. When working with hundreds or thousands of nodes, one has to handle issues such as how to parallelize the computation, distribute the data, and handle failures. One of the most popular open-source distributed frameworks for this purpose is Hadoop [ 16 ]. Hadoop implements the MapReduce model for processing and generating large datasets. MapReduce uses map and reduce primitives: the map operation turns each logical record in the input into a set of intermediate key/value pairs, and the reduce operation combines all values that share the same key [ 17 ]. It efficiently parallelizes the computation, handles failures, and schedules inter-machine communication across large-scale clusters of machines. The Hadoop Distributed File System (HDFS) is the file-system component that provides scalable, efficient, replica-based storage of data across the nodes that form a cluster [ 16 ]. Hadoop also has other tools that enhance the storage and processing components, which is why many large companies, such as Yahoo and Facebook, rapidly adopted it. Hadoop has enabled researchers to use datasets that would otherwise be impossible to handle. Many large projects, such as determining the correlation between air quality data and asthma admissions, or drug development using genomic and proteomic data, are implementing Hadoop. With the Hadoop system in place, healthcare analytics need not be held back by data volume.
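As a hedged sketch of the MapReduce pattern just described, the two small scripts below could be run with Hadoop Streaming, which pipes records through any executable via standard input and output. They count encounters per diagnosis code in a tab-separated file whose layout (patient_id, diagnosis_code, date) is assumed here purely for illustration.

```python
#!/usr/bin/env python3
# mapper.py -- emits "diagnosis_code<TAB>1" for every input record.
# Assumed input layout (illustrative): patient_id<TAB>diagnosis_code<TAB>date
import sys

for line in sys.stdin:
    fields = line.rstrip("\n").split("\t")
    if len(fields) >= 2:
        print(f"{fields[1]}\t1")
```

```python
#!/usr/bin/env python3
# reducer.py -- sums the counts for each diagnosis code.
# Hadoop Streaming delivers mapper output grouped and sorted by key.
import sys

current_key, count = None, 0
for line in sys.stdin:
    key, value = line.rstrip("\n").split("\t")
    if key != current_key:
        if current_key is not None:
            print(f"{current_key}\t{count}")
        current_key, count = key, 0
    count += int(value)
if current_key is not None:
    print(f"{current_key}\t{count}")
```

The same pair can be tested locally, before submitting to a cluster, with a pipeline such as `cat encounters.tsv | python3 mapper.py | sort | python3 reducer.py`.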

Apache Spark

Apache Spark is an open-source alternative to Hadoop. It is a unified engine for distributed data processing that includes higher-level libraries supporting SQL queries ( Spark SQL ), streaming data ( Spark Streaming ), machine learning ( MLlib ) and graph processing ( GraphX ) [ 18 ]. These libraries increase developer productivity because the programming interface requires less coding effort and the components can be seamlessly combined to build more complex computations. By implementing Resilient Distributed Datasets (RDDs), Spark supports in-memory processing of data, which can make it roughly 100× faster than Hadoop in multi-pass analytics on smaller datasets [ 19 , 20 ]. This is especially true when the data size is smaller than the available memory [ 21 ], which implies that processing truly big data with Apache Spark requires a large amount of memory. Since memory is more expensive than hard-drive storage, MapReduce is expected to remain more cost-effective than Apache Spark for very large datasets. Similarly, Apache Storm was developed to provide a real-time framework for data-stream processing; it supports most programming languages and offers good horizontal scalability and built-in fault tolerance for big data analysis.
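A minimal PySpark sketch of the same counting task is shown below, assuming a hypothetical CSV of encounters with a `diagnosis` column; it illustrates how the DataFrame API reduces the amount of code compared with a hand-written MapReduce job.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("encounter-counts").getOrCreate()

# Hypothetical input file with a header row and a 'diagnosis' column.
encounters = spark.read.csv("encounters.csv", header=True)

counts = (
    encounters.groupBy("diagnosis")
              .count()                        # adds a 'count' column per group
              .orderBy("count", ascending=False)
)
counts.show(10)                               # top ten diagnoses by encounter volume
spark.stop()
```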

Machine learning for information extraction, data analysis and predictions

In healthcare, patient data includes recorded signals such as the electrocardiogram (ECG), images, and videos, and healthcare providers have barely managed to convert such data into EHRs. Efforts are under way to digitize patient histories from pre-EHR-era notes and to supplement the standardization process by turning static images into machine-readable text. Optical character recognition (OCR) software, for example, can recognize handwriting as well as computer fonts and thus push digitization forward. Such unstructured and structured healthcare datasets hold an untapped wealth of information that can be harnessed using advanced AI programs to draw critical, actionable insights for patient care. AI has in fact emerged as the method of choice for big data applications in medicine and has quickly found its niche in decision-making for the diagnosis of diseases. Healthcare professionals analyze such data for targeted abnormalities using appropriate ML approaches, and ML can filter structured information out of such raw data.

Extracting information from EHR datasets

Emerging ML and AI based strategies are helping to refine the healthcare industry’s information-processing capabilities. For example, natural language processing (NLP) is a rapidly developing area of machine learning that can identify key syntactic structures in free text, assist speech recognition, and extract the meaning behind a narrative. NLP tools can help generate new documents, such as a clinical visit summary, or be used to dictate clinical notes. The unique content and complexity of clinical documentation can be challenging for many NLP developers; nonetheless, approaches such as NLP should allow relevant information to be extracted from healthcare data.
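The toy example below, using scikit-learn on a few invented note snippets, illustrates the general idea of turning free text into structured signals; real clinical NLP systems rely on far richer pipelines (negation handling, ontologies such as SNOMED CT, de-identification) than this sketch suggests.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented, non-clinical example snippets labelled for smoking status.
notes = [
    "patient reports smoking one pack per day",
    "denies tobacco use, exercises regularly",
    "former smoker, quit ten years ago",
    "no history of smoking or alcohol use",
]
labels = ["smoker", "non-smoker", "smoker", "non-smoker"]

# TF-IDF features feeding a simple linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(notes, labels)

print(model.predict(["patient continues to smoke despite counselling"]))
```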

AI has also been used to provide predictive capabilities on top of healthcare big data. For example, ML algorithms can turn the diagnostic reading of medical images into automated decision support. Although healthcare professionals are unlikely to be replaced by machines in the near future, AI can certainly assist physicians in making better clinical decisions, and may even substitute for human judgment in certain narrow functional areas of healthcare.

Image analytics

Some of the most widely used imaging techniques in healthcare include computed tomography (CT), magnetic resonance imaging (MRI), X-ray, molecular imaging, ultrasound, photo-acoustic imaging, functional MRI (fMRI), positron emission tomography (PET), electroencephalography (EEG), and mammography. These techniques capture high-definition medical images (patient data) of large size. Healthcare professionals such as radiologists and other physicians do an excellent job of analyzing this data for targeted abnormalities, but for many diseases there is a shortage of such specialized professionals. To help compensate for this shortage, systems such as the Picture Archiving and Communication System (PACS) have been developed for storing and conveniently accessing medical images and report data [ 22 ]. PACSs are popular for delivering images to local workstations, using protocols such as Digital Imaging and Communications in Medicine (DICOM). However, data exchange with a PACS relies on structured data to retrieve medical images, which by nature misses the unstructured information contained in some biomedical images. It is therefore possible to overlook additional information about a patient’s health status present in these images or similar data; a professional focused on diagnosing an unrelated condition might not notice it, especially when the condition is still emerging. To help in such situations, image analytics is making an impact on healthcare by actively extracting disease biomarkers from biomedical images. This approach uses ML and pattern-recognition techniques to draw insights from massive volumes of clinical image data, with the aim of transforming the diagnosis, treatment and monitoring of patients and enhancing the diagnostic capability of medical imaging for clinical decision-making.

A number of software tools have been developed around functionalities such as generic processing, registration, segmentation, visualization, reconstruction, simulation and diffusion modelling to perform medical image analysis and uncover hidden information. For example, the Visualization Toolkit is freely available software that allows powerful processing and analysis of 3D images from medical tests [ 23 ], while SPM can process and analyze five types of brain imaging data (MRI, fMRI, PET, CT and EEG) [ 24 ]. Other software packages, such as GIMIAS, Elastix, and MITK, support all image types. Various other widely used tools and their features in this domain are listed in Table  1 . Such bioinformatics-based big data analysis can extract greater insight and value from imaging data to support precision medicine projects, clinical decision support tools, and other modes of healthcare; for example, it can be used to monitor new targeted treatments for cancer.

Big data from omics

Big data from “omics” studies presents a new kind of challenge for bioinformaticians: robust algorithms are required to analyze such complex data from biological systems, with the ultimate goal of converting this huge amount of data into an informative knowledge base. The application of bioinformatics approaches to transform biomedical and genomics data into predictive and preventive health is known as translational bioinformatics, and it is at the forefront of data-driven healthcare. Various kinds of quantitative data in healthcare, for example from laboratory measurements, medication records and genomic profiles, can be combined and used to identify new meta-data that can help guide precision therapies [ 25 ]. This is why new technologies are required to help analyze this digital wealth. Highly ambitious, multimillion-dollar projects such as the “ Big Data Research and Development Initiative ” have been launched with the aim of enhancing the quality of big data tools and techniques for better organization, efficient access and smart analysis of big data. Many advantages are anticipated from processing ‘ omics’ data from the large-scale Human Genome Project and other population sequencing projects: in projects such as the 1000 Genomes Project, researchers have access to an enormous amount of raw data, and the Human Genome Project-based Encyclopedia of DNA Elements (ENCODE) project aimed to determine all functional elements in the human genome using bioinformatics approaches. Here, we list some of the widely used bioinformatics-based tools for big data analytics on omics data.

SparkSeq is an efficient, cloud-ready platform based on the Apache Spark framework and the Hadoop library that is used for interactive analysis of genomic data with nucleotide precision.

SAMQA identifies errors and ensures the quality of large-scale genomic data. This tool was originally built for the National Institutes of Health Cancer Genome Atlas project to identify and report errors, including sequence alignment/map (SAM) format errors and empty reads.

ART can simulate profiles of read errors and read lengths for data obtained using high-throughput sequencing platforms, including the SOLiD and Illumina platforms.

DistMap is another toolkit for distributed short-read mapping on a Hadoop cluster that aims to cover a wide range of sequencing applications. For instance, its BWA mapper can process 500 million read pairs in about 6 h, approximately 13 times faster than a conventional single-node mapper.

SeqWare is a query engine based on the Apache HBase database system that enables access to large-scale whole-genome datasets by integrating genome browsers and tools.

CloudBurst is a parallel computing model used in genome mapping experiments to improve the scalability of reading large sequencing data.

Hydra uses the Hadoop distributed computing framework to process large peptide and spectra databases for proteomics datasets. This tool can perform 27 billion peptide scorings in less than 60 min on a Hadoop cluster.

BlueSNP is an R package based on the Hadoop platform used for genome-wide association studies (GWAS), focusing primarily on the statistical readouts needed to obtain significant associations between genotype and phenotype datasets; a toy sketch of this kind of association test follows the tool list below. The tool is estimated to be able to analyze 1000 phenotypes across 10^6 SNPs in 10^4 individuals in about half an hour.

Myrna, a cloud-based pipeline, provides information on gene expression level differences, covering read alignment, data normalization, and statistical modeling.
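As a hedged, toy illustration of the genotype–phenotype association tests that packages such as BlueSNP parallelize across a cluster, the snippet below applies a chi-square test of independence to a single SNP's genotype counts in cases versus controls. The numbers are invented, and genome-wide analyses repeat this kind of test, with multiple-testing correction, across millions of variants.

```python
from scipy.stats import chi2_contingency

# Illustrative 2x3 contingency table: genotype counts (AA, Aa, aa)
# for cases (first row) and controls (second row). Numbers are invented.
table = [
    [120, 240, 140],   # cases
    [180, 250,  70],   # controls
]

chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3g}")
```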

The past few years have witnessed a tremendous increase in disease-specific datasets from omics platforms. For example, the ArrayExpress Archive of Functional Genomics data repository contains information from approximately 30,000 experiments and more than one million functional assays. The growing amount of data demands better and more efficient bioinformatics-driven packages to analyze and interpret the information obtained. This has also led to the birth of specific tools to analyze such massive amounts of data. Below, we mention some of the most popular commercial platforms for big data analytics.

Commercial platforms for healthcare data analytics

To tackle big data challenges and perform smoother analytics, various companies have implemented AI to analyze published results, textual data, and image data and obtain meaningful outcomes. IBM Corporation is one of the biggest and most experienced players in this sector, providing healthcare analytics services commercially. IBM’s Watson Health is an AI platform for sharing and analyzing health data among hospitals, providers and researchers. Similarly, Flatiron Health provides technology-oriented services in healthcare analytics, focused especially on cancer research. Other large companies, such as Oracle Corporation and Google Inc., are also focusing on developing cloud-based storage and distributed computing platforms. Interestingly, in recent years several companies and start-ups have emerged to provide healthcare-based analytics and solutions; some of the vendors in the healthcare sector are listed in Table  2 . Below we discuss a few of these commercial solutions.

Ayasdi is one such large vendor, focusing on ML-based methodologies to provide a machine-intelligence platform along with an application framework of tried and tested enterprise scalability. It provides various applications for healthcare analytics, for example to understand and manage clinical variation and to control clinical care costs. It can also analyze and manage how hospitals are organized, the conversations between doctors, doctors’ risk-oriented treatment decisions, and the care they deliver to patients. It additionally provides an application for the assessment and management of population health, a proactive strategy that goes beyond traditional risk-analysis methodologies, using ML intelligence to predict future risk trajectories, identify risk drivers, and propose solutions for the best outcomes. A strategic illustration of the company’s analytics methodology is provided in Fig.  4 .

figure 4

Illustration of the “Intelligent Application Suite” provided by Ayasdi for analyses such as clinical variation, population health, and risk management in the healthcare sector

Linguamatics

Linguamatics is an NLP-based platform that relies on an interactive text-mining engine (I2E). I2E can extract and analyze a wide array of information, and results obtained with it are reported to be up to tenfold faster than with other tools, without requiring expert knowledge for data interpretation. The approach can surface genetic relationships and facts from unstructured data. Classical ML requires well-curated data as input to generate clean and filtered results, whereas NLP integrated into EHRs or clinical records facilitates the extraction of clean, structured information that often remains hidden in unstructured input data (Fig.  5 ).

figure 5

Schematic representation for the working principle of NLP-based AI system used in massive data retention and analysis in Linguamatics

IBM Watson

This is one of the signature offerings of the tech giant IBM, targeting big data analytics in almost every professional sector. The platform makes extensive use of ML and AI based algorithms to extract the maximum information from minimal input. IBM Watson follows the approach of integrating a wide array of healthcare domains to provide meaningful, structured data (Fig.  6 ). In an attempt to uncover novel drug targets, specifically in cancer disease models, IBM Watson and Pfizer formed a collaboration to accelerate the discovery of novel immuno-oncology combinations. Watson’s deep-learning modules, integrated with AI technologies, allow researchers to interpret complex genomic datasets. IBM Watson has been used to predict specific types of cancer based on gene expression profiles obtained from various large datasets, pointing to multiple druggable targets. IBM Watson is also used in drug discovery programs by integrating curated literature and building network maps that provide a detailed overview of the molecular landscape in a specific disease model.

figure 6

IBM Watson in healthcare data analytics. Schematic representation of the various functional modules in IBM Watson’s big-data healthcare package. For instance, the drug discovery domain involves a network of highly coordinated data acquisition and analysis steps, from curating databases to building meaningful pathways for elucidating novel druggable targets

To analyze diversified medical data, the healthcare domain describes analytics in four categories: descriptive, diagnostic, predictive, and prescriptive analytics. Descriptive analytics describes and comments on the current medical situation, whereas diagnostic analytics explains the reasons and factors behind the occurrence of certain events, for example choosing a treatment option for a patient based on clustering and decision trees. Predictive analytics focuses on predicting future outcomes by determining trends and probabilities; these methods are mainly built on machine learning techniques and are helpful for understanding the complications a patient may develop. Prescriptive analytics proposes actions for optimal decision-making, for example avoiding a given treatment for a patient because of observed side effects and predicted complications. Integrating big data into healthcare analytics can be a major factor in improving the performance of current medical systems, but sophisticated strategies need to be developed, along with an architecture of best practices for the different types of analytics in the healthcare domain. There are, however, many challenges associated with implementing such strategies.
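A minimal sketch of the predictive category, with entirely synthetic data, is shown below: a logistic regression estimating the probability that a patient develops a complication from a few numeric features. It is meant only to make the four-category distinction concrete, not to describe any production model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic features: age, number of prior admissions, length of stay (days).
X = np.column_stack([
    rng.normal(65, 12, 500),
    rng.poisson(1.5, 500),
    rng.gamma(2.0, 2.0, 500),
])
# Synthetic binary outcome loosely tied to the features.
risk = 0.04 * (X[:, 0] - 65) + 0.8 * X[:, 1] + 0.2 * X[:, 2]
y = (risk + rng.normal(0, 1, 500) > 1.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", round(model.score(X_test, y_test), 2))
```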

Challenges associated with healthcare big data

Methods for big data management and analysis are continuously being developed, especially for real-time data streaming, capture, aggregation, analytics (using ML and predictive modelling), and visualization solutions that can help integrate EMRs better into healthcare. For example, adoption of federally tested and certified EHR programs in the U.S. healthcare sector is nearly complete [ 7 ]. However, the availability of hundreds of government-certified EHR products, each with different clinical terminologies, technical specifications, and functional capabilities, has led to difficulties in interoperability and data sharing. Nonetheless, we can safely say that the healthcare industry has entered a ‘post-EMR’ deployment phase, where the main objective is to gain actionable insights from the vast amounts of data collected as EMRs. Here, we discuss some of these challenges in brief.

Storage

Storing large volumes of data is one of the primary challenges. Many organizations are comfortable with data storage on their own premises, which offers control over security, access, and up-time; however, an on-site server network can be expensive to scale and difficult to maintain. With decreasing costs and increasing reliability, cloud-based storage is often the better option, and most healthcare organizations have opted for it. Organizations must choose cloud partners that understand the importance of healthcare-specific compliance and security issues. Cloud storage also offers lower up-front costs, nimble disaster recovery, and easier expansion. Organizations can likewise take a hybrid approach to their data storage programs, which may be the most flexible and workable approach for providers with varying data access and storage needs.

Cleaning

After acquisition, the data needs to be cleansed or scrubbed to ensure accuracy, correctness, consistency, relevancy, and purity. This cleaning process can be manual or automated using logic rules to ensure high levels of accuracy and integrity. More sophisticated tools use machine-learning techniques to reduce time and expense and to stop flawed data from derailing big data projects.
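The short pandas sketch below shows the flavour of such rule-based cleansing, using invented records and column names: duplicates are dropped, types are coerced, and physiologically implausible values are masked for review.

```python
import numpy as np
import pandas as pd

# Invented records; column names are illustrative only.
raw = pd.DataFrame({
    "patient_id": ["p1", "p1", "p2", "p3"],
    "heart_rate": ["72", "72", "350", "na"],   # mixed types, one implausible value
    "visit_date": ["2019-01-02", "2019-01-02", "2019-01-05", "2019-01-07"],
})

clean = (
    raw.drop_duplicates()
       .assign(
           heart_rate=lambda d: pd.to_numeric(d["heart_rate"], errors="coerce"),
           visit_date=lambda d: pd.to_datetime(d["visit_date"], errors="coerce"),
       )
)
# Rule: heart rates outside 20-250 bpm are flagged as invalid for manual review.
clean.loc[~clean["heart_rate"].between(20, 250), "heart_rate"] = np.nan
print(clean)
```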

Unified format

Patients produce a huge volume of data that is not easy to capture in the traditional EHR format, as it is messy and not easily manageable. Big data is especially difficult for healthcare providers to handle when it arrives without a clear organization. A need therefore arose to codify all clinically relevant information for claims, billing, and clinical analytics, which is why medical coding systems such as Current Procedural Terminology (CPT) and the International Classification of Diseases (ICD) code sets were developed to represent core clinical concepts. However, these code sets have their own limitations.

Accuracy

Some studies have observed that the reporting of patient data into EMRs or EHRs is not yet entirely accurate [ 26 , 27 , 28 , 29 ], probably because of poor EHR usability, complex workflows, and an incomplete understanding of why capturing big data well is so important. All of these factors can contribute to quality issues for big data throughout its lifecycle. EHRs are intended to improve the quality and communication of data in clinical workflows, although reports indicate discrepancies in these respects. Documentation quality might improve by using self-report questionnaires from patients about their symptoms.

Image pre-processing

Studies have observed various physical factors that can degrade data quality and lead to misinterpretation of existing medical records [ 30 ]. Medical images often suffer from technical barriers involving multiple types of noise and artifacts. Improper handling of medical images can also corrupt them; for instance, it might lead to a delineation of anatomical structures, such as veins, that does not correspond to the real case. Noise reduction, artifact removal, contrast adjustment of acquired images, and image-quality correction after mishandling are some of the measures that can be implemented to address these issues.
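A minimal sketch of such pre-processing with scikit-image is shown below on a synthetic grayscale image; the denoising and contrast steps are standard library calls, while the parameters are illustrative rather than clinically validated.

```python
import numpy as np
from skimage import exposure, filters, util

# Synthetic stand-in for a grayscale scan: a bright disc on a dark background.
image = np.zeros((128, 128))
yy, xx = np.ogrid[:128, :128]
image[(yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2] = 0.8

noisy = util.random_noise(image, mode="gaussian", var=0.01)

denoised = filters.gaussian(noisy, sigma=1.0)                        # reduce sensor noise
contrast = exposure.equalize_adapthist(denoised, clip_limit=0.03)    # adaptive contrast

print("intensity range after pre-processing:", contrast.min(), contrast.max())
```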

Security

There have been so many security breaches, hacking incidents, phishing attacks, and ransomware episodes that data security has become a priority for healthcare organizations. After an array of vulnerabilities was noticed, a list of technical safeguards was developed for protected health information (PHI). These rules, termed the HIPAA Security Rule, help guide organizations on storage, transmission, authentication protocols, and controls over access, integrity, and auditing. Common security measures such as up-to-date anti-virus software, firewalls, encryption of sensitive data, and multi-factor authentication can save a lot of trouble.
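As one small, hedged example of encrypting sensitive fields at rest, the snippet below uses the `cryptography` package's Fernet recipe (symmetric, authenticated encryption); key management, access control and auditing, which the HIPAA Security Rule also covers, are outside the scope of this sketch.

```python
from cryptography.fernet import Fernet

# In practice the key would come from a managed key store, not be generated inline.
key = Fernet.generate_key()
cipher = Fernet(key)

phi = b"MRN 0012345; diagnosis: type 2 diabetes"   # invented example record
token = cipher.encrypt(phi)                         # ciphertext safe to store or transmit
print(cipher.decrypt(token) == phi)                 # True: round trip succeeds
```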

Metadata

A successful data governance plan requires complete, accurate, and up-to-date metadata for all stored data. The metadata should record information such as the time of creation, the purpose of and person responsible for the data, and its previous usage (by whom, why, how, and when), for the benefit of researchers and data analysts. This allows analysts to replicate previous queries, supports later scientific studies and accurate benchmarking, increases the usefulness of the data, and prevents the creation of “data dumpsters” of low or no use.

Querying

Metadata makes it easier for organizations to query their data and get answers. However, in the absence of proper interoperability between datasets, query tools may not be able to reach an entire repository of data. Different components of a dataset should also be well interconnected, or linked, and easily accessible; otherwise a complete portrait of an individual patient’s health cannot be generated. Medical coding systems such as ICD-10, SNOMED CT, or LOINC must be implemented to reduce free-form concepts to a shared ontology. Once the accuracy, completeness, and standardization of the data are not in question, Structured Query Language (SQL) can be used to query large datasets and relational databases.
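The self-contained sketch below uses Python's built-in sqlite3 module and an invented two-table schema to show the kind of SQL query that becomes possible once encounters are coded consistently (here with ICD-10 codes).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE patients (id TEXT PRIMARY KEY, birth_year INTEGER);
    CREATE TABLE encounters (patient_id TEXT, icd10_code TEXT, visit_date TEXT);
    INSERT INTO patients VALUES ('p1', 1952), ('p2', 1987);
    INSERT INTO encounters VALUES
        ('p1', 'E11.9', '2019-01-02'),   -- type 2 diabetes
        ('p1', 'I10',   '2019-02-11'),   -- essential hypertension
        ('p2', 'E11.9', '2019-03-05');
""")

# Count distinct patients per diagnosis code, joining coded encounters to demographics.
rows = conn.execute("""
    SELECT e.icd10_code, COUNT(DISTINCT e.patient_id) AS n_patients
    FROM encounters e JOIN patients p ON p.id = e.patient_id
    GROUP BY e.icd10_code ORDER BY n_patients DESC
""").fetchall()
print(rows)
```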

Visualization

A clean and engaging visualization of data, with charts, heat maps, and histograms to illustrate contrasting figures, and correct labeling of information to reduce potential confusion, makes it much easier to absorb information and use it appropriately. Other examples include bar charts, pie charts, and scatterplots, each with its own specific way of conveying the data.
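A short matplotlib sketch with invented counts illustrates the point: a clearly labelled bar chart of encounters per diagnosis is far easier to absorb than the underlying table.

```python
import matplotlib.pyplot as plt

# Invented summary data for illustration only.
diagnoses = ["E11.9", "I10", "J45", "N39.0"]
encounters = [412, 387, 205, 118]

fig, ax = plt.subplots(figsize=(5, 3))
ax.bar(diagnoses, encounters, color="steelblue")
ax.set_xlabel("ICD-10 code")
ax.set_ylabel("Encounters (2019)")
ax.set_title("Encounters per diagnosis")
fig.tight_layout()
fig.savefig("encounters_per_diagnosis.png")   # or plt.show() in an interactive session
```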

Data sharing

Patients may or may not receive their care at multiple locations. When they do, sharing data with other healthcare organizations is essential; if the data is not interoperable, its movement between disparate organizations can be severely curtailed by technical and organizational barriers, leaving clinicians without key information for decisions about follow-up and treatment strategies. Solutions such as Fast Healthcare Interoperability Resources (FHIR) and public APIs, together with CommonWell (a not-for-profit trade association) and Carequality (a consensus-built, common interoperability framework), are making data interoperability and sharing easier and more secure. The biggest roadblock to data sharing is the treatment of data as a commodity that can provide a competitive advantage; as a result, both providers and vendors sometimes intentionally interfere with the flow of information between different EHR systems [ 31 ].
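As a hedged sketch of FHIR-style exchange, the snippet below requests a Patient resource as JSON over FHIR's REST interface; the base URL is a placeholder rather than a real server, and a production client would add OAuth-based authorization and proper error handling.

```python
import requests

FHIR_BASE = "https://example.org/fhir"   # placeholder endpoint, not a real server

response = requests.get(
    f"{FHIR_BASE}/Patient/example",
    headers={"Accept": "application/fhir+json"},
    timeout=10,
)
response.raise_for_status()

patient = response.json()
# Standard FHIR Patient fields; .get() guards against optional elements.
print(patient.get("id"), patient.get("birthDate"), patient.get("gender"))
```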

Healthcare providers will need to overcome every challenge on this list, and more, to develop a big data exchange ecosystem that provides trustworthy, timely, and meaningful information by connecting all members of the care continuum. Time, commitment, funding, and communication will be required before these challenges are overcome.

Big data analytics for cutting costs

To develop a big-data-based healthcare system that can exchange big data and provide us with trustworthy, timely, and meaningful information, we need to overcome every challenge mentioned above, which will require investment in time, funding, and commitment. However, as with other technological advances, the success of these ambitious steps would ease the present burdens on healthcare, especially in terms of cost. It has been estimated that implementing big data analytics could save healthcare organizations more than 25% in annual costs in the coming years. Better diagnosis and disease prediction enabled by big data analytics can reduce costs by decreasing hospital readmission rates: healthcare firms currently do not understand the variables responsible for readmissions well enough, and determining these relationships would make it easier to improve protocols for dealing with patients and preventing readmission. Big data analytics can also help in optimizing staffing, forecasting operating-room demand, streamlining patient care, and improving the pharmaceutical supply chain. All of these factors ultimately reduce organizations’ healthcare costs.

Quantum mechanics and big data analysis

Big data sets can be staggering in size, and their analysis remains daunting even with the most powerful modern computers. For most analyses, the bottleneck lies in the computer’s ability to access its memory rather than in the processor [ 32 , 33 ]. The capacity, bandwidth and latency requirements of the memory hierarchy outweigh the computational requirements so much that supercomputers are increasingly used for big data analysis [ 34 , 35 ]. An additional solution is the application of quantum approaches to big data analysis.

Quantum computing and its advantages

Common digital computing uses binary digits (bits) to encode data, whereas quantum computation uses quantum bits, or qubits [ 36 ]. A qubit is the quantum analogue of the classical bit: it can represent a zero, a one, or any linear combination (called a superposition ) of those two basis states [ 37 ]. Qubits therefore occupy a continuum of superposition states rather than only the two discrete states available to classical bits, which, for certain problems, allows quantum computers to work dramatically faster than conventional ones. For example, a dataset with 2^n points requires on the order of 2^n classical bits to represent, whereas the same information can, in principle, be encoded in the amplitudes of just n qubits. Quantum computers exploit quantum mechanical phenomena such as superposition and quantum entanglement to perform computations [ 38 , 39 ].
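The short numpy sketch below makes the superposition idea concrete without any quantum hardware: it applies a Hadamard gate to a single qubit initialised in |0⟩ and prints the resulting measurement probabilities (one half for each basis state).

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                       # qubit state |0> as a state vector
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

superposed = hadamard @ ket0                      # (|0> + |1>) / sqrt(2)
probabilities = np.abs(superposed) ** 2           # Born rule: |amplitude|^2

print("amplitudes:", superposed)
print("P(measure 0), P(measure 1):", probabilities)
```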

Quantum algorithms can speed up big data analysis exponentially [ 40 ], and some complex problems believed to be unsolvable with conventional computing can be solved by quantum approaches. For example, current encryption techniques such as RSA, public-key (PK) cryptography, and the Data Encryption Standard (DES), which are considered secure today, could become irrelevant in the future because quantum computers may break them quickly [ 41 ]. Quantum approaches can also dramatically reduce the information required for big data analysis; for example, quantum theory can maximize the distinguishability of a multilayer network using a minimum number of layers [ 42 ]. In addition, quantum approaches may require relatively small datasets to achieve maximally sensitive data analysis compared with conventional (machine-learning) techniques, and can thus drastically reduce the computational power needed to analyze big data. Even though quantum computing is still in its infancy and presents many open challenges, it is beginning to be applied to healthcare data.

Applications in big data analysis

Quantum computing is gaining momentum and appears to be a potential solution for big data analysis. For example, the identification of rare events, such as the production of Higgs bosons at the Large Hadron Collider (LHC), can now be approached with quantum methods [ 43 ]. The LHC generates huge amounts of collision data (about 1 PB/s) that must be filtered and analyzed. One such approach, quantum annealing for ML (QAML), which combines ML and quantum computing on a programmable quantum annealer, helps reduce human intervention and increases the accuracy of assessing particle-collision data. In another example, a quantum support vector machine was implemented for both the training and classification stages to classify new data [ 44 ]. Such quantum approaches could find applications in many areas of science [ 43 ]. Indeed, a recurrent quantum neural network (RQNN) has been implemented to increase signal separability in electroencephalogram (EEG) signals [ 45 ], and quantum annealing has been applied to intensity-modulated radiotherapy (IMRT) beamlet intensity optimization [ 46 ]. Further healthcare-related applications of quantum approaches include quantum sensors and quantum microscopes [ 47 ].

Conclusions and future prospects

Nowadays, various biomedical and healthcare tools such as genomics, mobile biometric sensors, and smartphone apps generate a large amount of data, so it is essential to know what can be achieved with this data and to assess it accordingly. For example, analysis of such data can provide further insights into procedural, technical, medical and other types of improvements in healthcare. A review of current healthcare procedures suggests that the full potential of patient-specific, personalized medicine is on its way to being realized. The collective big data analysis of EHRs, EMRs and other medical data is continuously helping to build better prognostic frameworks, and the companies providing services for healthcare analytics and clinical transformation are contributing to better and more effective outcomes. Common goals of these companies include reducing the cost of analytics, developing effective Clinical Decision Support (CDS) systems, providing platforms for better treatment strategies, and identifying and preventing fraud associated with big data, although almost all of them face regulatory challenges over how private data is handled, shared and kept safe. The combined pool of data from healthcare organizations and biomedical researchers has resulted in a better outlook on, determination of, and treatment of various diseases, and has helped build a better and healthier personalized healthcare framework. The modern healthcare fraternity has realized the potential of big data and has therefore implemented big data analytics in healthcare and clinical practices. From supercomputers to quantum computers, new hardware is helping to extract meaningful information from big data in dramatically reduced time. With high hopes of extracting new, actionable knowledge that can improve the present status of healthcare services, researchers are plunging into biomedical big data despite the infrastructure challenges. Clinical trials, joint analysis of pharmacy and insurance claims, and biomarker discovery are all part of novel and creative ways to analyze healthcare big data.

Big data analytics bridges the gap between structured and unstructured data sources; the shift to an integrated data environment is a well-known hurdle to overcome. Interestingly, the principle of big data relies heavily on the idea that the more information there is, the more insight one can gain from it and the better one can predict future events. Reliable consulting firms and healthcare companies rightly project that the big data healthcare market is poised to grow at an exponential rate, and even in a short span we have witnessed a spectrum of analytics already in use that have had a significant impact on decision-making and performance in the healthcare industry. The exponential growth of medical data from various domains has forced computational experts to design innovative strategies to analyze and interpret such an enormous amount of data within a given timeframe. The integration of computational systems for signal processing by both researchers and practicing medical professionals has also grown. Developing a detailed model of the human body by combining physiological data with “-omics” techniques could thus be the next big target; this could enhance our knowledge of disease conditions and possibly help in the development of novel diagnostic tools. The continuous rise in available genomic data, including the hidden errors inherent in experimental and analytical practices, needs further attention. However, there are opportunities at each step of this extensive process to introduce systemic improvements in healthcare research.

The high volume of medical data collected across heterogeneous platforms poses a challenge to data scientists for careful integration and implementation. It is therefore suggested that a further revolution in healthcare is needed to bring together bioinformatics, health informatics and analytics to promote personalized and more effective treatments. Furthermore, new strategies and technologies should be developed to understand the nature (structured, semi-structured, unstructured), complexity (dimensions and attributes) and volume of the data in order to derive meaningful information. The greatest asset of big data lies in its limitless possibilities. The birth and integration of big data within the past few years has brought substantial advancements in the healthcare sector, ranging from medical data management to drug discovery programs for complex human diseases, including cancer and neurodegenerative disorders. As a simple example, since the late 2000s the healthcare market has witnessed advancements in the EHR system with respect to data collection, management and usability. We believe that big data will complement and bolster the existing pipeline of healthcare advances rather than replacing skilled manpower, subject-matter experts and intellectuals, a notion argued by many. One can clearly see the transition of the healthcare market from a broad, volume-based model to a personalized or individual-specific domain, and it is therefore essential for technologists and professionals to understand this evolving situation. In the coming years, big data analytics can be expected to march towards a predictive system, meaning the prediction of future outcomes of an individual’s health state based on current or existing data (such as EHR-based and omics-based data). Similarly, structured information obtained for a certain geography might be aggregated into population health information. Taken together, big data will facilitate healthcare by enabling the prediction of epidemics (in relation to population health), providing early warnings of disease conditions, and helping in the discovery of novel biomarkers and intelligent therapeutic intervention strategies for an improved quality of life.

Availability of data and materials

Not applicable.

References

1. Laney D. 3D data management: controlling data volume, velocity, and variety. Application delivery strategies. Stamford: META Group Inc; 2001.
2. Mauro AD, Greco M, Grimaldi M. A formal definition of big data based on its essential features. Libr Rev. 2016;65(3):122–35.
3. Gubbi J, et al. Internet of Things (IoT): a vision, architectural elements, and future directions. Future Gener Comput Syst. 2013;29(7):1645–60.
4. Doyle-Lindrud S. The evolution of the electronic health record. Clin J Oncol Nurs. 2015;19(2):153–4.
5. Gillum RF. From papyrus to the electronic tablet: a brief history of the clinical medical record with lessons for the digital age. Am J Med. 2013;126(10):853–7.
6. Reiser SJ. The clinical record in medicine part 1: learning from cases. Ann Intern Med. 1991;114(10):902–7.
7. Reisman M. EHRs: the challenge of making electronic data usable and interoperable. Pharm Ther. 2017;42(9):572–5.
8. Murphy G, Hanken MA, Waters K. Electronic health records: changing the vision. Philadelphia: Saunders W B Co; 1999. p. 627.
9. Shameer K, et al. Translational bioinformatics in the era of real-time biomedical, health care and wellness data streams. Brief Bioinform. 2017;18(1):105–24.
10. Service RF. The race for the $1000 genome. Science. 2006;311(5767):1544–6.
11. Stephens ZD, et al. Big data: astronomical or genomical? PLoS Biol. 2015;13(7):e1002195.
12. Yin Y, et al. The internet of things in healthcare: an overview. J Ind Inf Integr. 2016;1:3–13.
13. Moore SK. Unhooking medicine [wireless networking]. IEEE Spectr. 2001;38(1):107–8, 110.
14. Nasi G, Cucciniello M, Guerrazzi C. The role of mobile technologies in health care processes: the case of cancer supportive care. J Med Internet Res. 2015;17(2):e26.
15. Apple. ResearchKit/ResearchKit: ResearchKit 1.5.3. 2017.
16. Shvachko K, et al. The Hadoop distributed file system. In: Proceedings of the 2010 IEEE 26th symposium on mass storage systems and technologies (MSST). New York: IEEE Computer Society; 2010. p. 1–10.
17. Dean J, Ghemawat S. MapReduce: simplified data processing on large clusters. Commun ACM. 2008;51(1):107–13.
18. Zaharia M, et al. Apache Spark: a unified engine for big data processing. Commun ACM. 2016;59(11):56–65.
19. Gopalani S, Arora R. Comparing Apache Spark and Map Reduce with performance analysis using K-means; 2015.
20. Ahmed H, et al. Performance comparison of Spark clusters configured conventionally and a cloud service. Procedia Comput Sci. 2016;82:99–106.
21. Saouabi M, Ezzati A. A comparative between Hadoop MapReduce and Apache Spark on HDFS. In: Proceedings of the 1st international conference on internet of things and machine learning. Liverpool: ACM; 2017. p. 1–4.
22. Strickland NH. PACS (picture archiving and communication systems): filmless radiology. Arch Dis Child. 2000;83(1):82–6.
23. Schroeder W, Martin K, Lorensen B. The visualization toolkit. 4th ed. Clifton Park: Kitware; 2006.
24. Friston K, et al. Statistical parametric mapping. London: Academic Press; 2007. p. vii.
25. Li L, et al. Identification of type 2 diabetes subgroups through topological analysis of patient similarity. Sci Transl Med. 2015;7(311):311ra174.
26. Valikodath NG, et al. Agreement of ocular symptom reporting between patient-reported outcomes and medical records. JAMA Ophthalmol. 2017;135(3):225–31.
27. Fromme EK, et al. How accurate is clinician reporting of chemotherapy adverse effects? A comparison with patient-reported symptoms from the Quality-of-Life Questionnaire C30. J Clin Oncol. 2004;22(17):3485–90.
28. Beckles GL, et al. Agreement between self-reports and medical records was only fair in a cross-sectional study of performance of annual eye examinations among adults with diabetes in managed care. Med Care. 2007;45(9):876–83.
29. Echaiz JF, et al. Low correlation between self-report and medical record documentation of urinary tract infection symptoms. Am J Infect Control. 2015;43(9):983–6.
30. Belle A, et al. Big data analytics in healthcare. Biomed Res Int. 2015;2015:370194.
31. Adler-Milstein J, Pfeifer E. Information blocking: is it occurring and what policy strategies can address it? Milbank Q. 2017;95(1):117–35.
32. Or-Bach Z. A 1,000x improvement in computer systems by bridging the processor-memory gap. In: 2017 IEEE SOI-3D-subthreshold microelectronics technology unified conference (S3S); 2017.
33. Mahapatra NR, Venkatrao B. The processor-memory bottleneck: problems and solutions. XRDS. 1999;5(3es):2.
34. Voronin AA, Panchenko VY, Zheltikov AM. Supercomputations and big-data analysis in strong-field ultrafast optical physics: filamentation of high-peak-power ultrashort laser pulses. Laser Phys Lett. 2016;13(6):065403.
35. Dollas A. Big data processing with FPGA supercomputers: opportunities and challenges. In: 2014 IEEE computer society annual symposium on VLSI; 2014.
36. Saffman M. Quantum computing with atomic qubits and Rydberg interactions: progress and challenges. J Phys B: At Mol Opt Phys. 2016;49(20):202001.
37. Nielsen MA, Chuang IL. Quantum computation and quantum information. 10th anniversary ed. Cambridge: Cambridge University Press; 2011. p. 708.
38. Raychev N. Quantum computing models for algebraic applications. Int J Sci Eng Res. 2015;6(8):1281–8.
39. Harrow A. Why now is the right time to study quantum computing. XRDS. 2012;18(3):32–7.
40. Lloyd S, Garnerone S, Zanardi P. Quantum algorithms for topological and geometric analysis of data. Nat Commun. 2016;7:10138.
41. Buchanan W, Woodward A. Will quantum computers be the end of public key encryption? J Cyber Secur Technol. 2017;1(1):1–22.
42. De Domenico M, et al. Structural reducibility of multilayer networks. Nat Commun. 2015;6:6864.
43. Mott A, et al. Solving a Higgs optimization problem with quantum annealing for machine learning. Nature. 2017;550:375.
44. Rebentrost P, Mohseni M, Lloyd S. Quantum support vector machine for big data classification. Phys Rev Lett. 2014;113(13):130503.
45. Gandhi V, et al. Quantum neural network-based EEG filtering for a brain-computer interface. IEEE Trans Neural Netw Learn Syst. 2014;25(2):278–88.
46. Nazareth DP, Spaans JD. First application of quantum annealing to IMRT beamlet intensity optimization. Phys Med Biol. 2015;60(10):4137–48.
47. Reardon S. Quantum microscope offers MRI for molecules. Nature. 2017;543(7644):162.


Acknowledgements

Author information

Sabyasachi Dash and Sushil Kumar Shakyawar contributed equally to this work

Authors and Affiliations

Department of Pathology and Laboratory Medicine, Weill Cornell Medicine, New York, 10065, NY, USA

Sabyasachi Dash

Center of Biological Engineering, University of Minho, Campus de Gualtar, 4710-057, Braga, Portugal

Sushil Kumar Shakyawar

SilicoLife Lda, Rua do Canastreiro 15, 4715-387, Braga, Portugal

Postgraduate School for Molecular Medicine, Warszawskiego Uniwersytetu Medycznego, Warsaw, Poland

Mohit Sharma

Małopolska Centre for Biotechnology, Jagiellonian University, Kraków, Poland

3B’s Research Group, Headquarters of the European Institute of Excellence on Tissue Engineering and Regenerative Medicine, AvePark - Parque de Ciência e Tecnologia, Zona Industrial da Gandra, Barco, 4805-017, Guimarães, Portugal

Sandeep Kaushik


Contributions

MS wrote the manuscript. SD and SKS added significant discussion that greatly improved the quality of the manuscript. SK designed the content sequence, guided SD, SKS and MS in writing and revising the manuscript, and checked the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Sandeep Kaushik.

Ethics declarations

Ethics approval and consent to participate

Consent for publication

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article

Cite this article.

Dash, S., Shakyawar, S.K., Sharma, M. et al. Big data in healthcare: management, analysis and future prospects. J Big Data 6, 54 (2019). https://doi.org/10.1186/s40537-019-0217-0


Received: 17 January 2019

Accepted: 06 June 2019

Published: 19 June 2019

DOI: https://doi.org/10.1186/s40537-019-0217-0


Keywords

  • Biomedical research
  • Big data analytics
  • Internet of things
  • Personalized medicine
  • Quantum computing


Challenges in Commercial Deployment of AI: Insights from The Rise and Fall of IBM Watson’s AI Medical System

When IBM set about commercializing its artificial intelligence-driven Watson platform in the healthcare market, its early successes were widely publicized. Senior managers and the media claimed that its diagnostic capabilities would soon surpass those of the sharpest doctors. The case describes the large gap between what was promised and what happened in practice, offering insider insights into why IBM’s projects failed. As the corporate commitment to AI escalated in response to successful lab results, cognitive dissonance arose between managers’ expectations and what they could actually deliver. How could that have happened? Three reasons for Watson’s downfall are explored: (1) the tendency for societal expectations to exceed actual technical capabilities, leading to a gap in perception between AI in the lab and AI in the field; (2) overselling of the economic benefits of AI by the salesforce; and (3) failure to secure the cooperation of key stakeholders, notably doctors, who were asked to improve the performance of AI but were undermined by claims that AI could outperform them.

After reading and analyzing the case, students/participants will be able to:

  • Explore how organizations deal with technological uncertainty when seeking to innovate
  • Understand what drove the escalation of IBM’s commitment to and investment in AI
  • Identify the roots of cognitive dissonance and stakeholders’ resistance to technological adoption

  • Artificial Intelligence
  • Digital Technologies
  • Strategy Implementation
  • Strategy Execution
  • Commercial Success
  • Escalating Commitment
  • Cognitive Dissonance
  • Stakeholder Management
  • Overselling

By Quy Huy, Timo Vuori, Tero Ojanpera and Lisa Simone Duke


Get to know our comprehensive Cybersecurity Portfolio: Learn More

  • Overview of Private Equity IT
  • IT Due Diligence
  • IT Integration for M&A
  • IT Integration for Carve-Outs
  • Cybersecurity Services Portfolio
  • Cybersecurity Assessment
  • Cybersecurity Managed Services
  • Compliance Services Portfolio
  • Microsoft 365 Services Portfolio
  • Microsoft 365 Assessment
  • Microsoft 365 Roadmap
  • Microsoft 365 Managed Services
  • Azure Services Portfolio
  • Azure Assessment
  • Azure Roadmap
  • Azure Managed Services
  • Company Overview
  • Company Details
  • Case Studies
  • Overview Of Private Equity IT
  • IT Integrations For M&A
  • IT Integrations For Carve-Outs



Case Study: Hybrid Cloud for Healthcare in IBM Cloud

Industry: Healthcare

Company: GELA

GELA – Grupo Empresarial las Américas is a leading group of healthcare enterprises with operations in both Florida and Latin America. The organization operates nine independent companies, including a CAP- and AABB-accredited medical laboratory, an oncology institute, a high-complexity clinic, and an outpatient services clinic.

Critical Need

GELA needed to establish a VMware hybrid cloud that would increase the capacity, availability, and performance of its IT infrastructure while delivering fast, efficient online health services to patients and treating physicians around the globe. The goal was to give GELA a disaster recovery and high-availability solution that could eventually handle all of its workloads. SAN-backed storage was a requirement in order to provide full backup services for both the on-premises and public cloud environments, as well as to support vMotion and easy VM migrations.

Challenges Faced

Challenge 1: Networking architecture

Organizing and establishing a well-defined VLAN and subnet infrastructure on-premises that could be easily and efficiently scaled and incorporated into a public cloud.

Challenge 2: Sizing of workloads

Determining adequate sizing for each workload in order to decide which workloads should stay on-premises and which were better suited to migration to the public cloud.

Challenge 3: Security and patient information privacy

GELA needed to guarantee patient privacy and seamless connectivity between its private and public clouds under every work protocol.

Solution Provided

We at ne Digital, an IBM Advanced Business Partner, built an elastic hybrid cloud based on VMware vSphere that seamlessly expands GELA’s local data center into two IBM Cloud data centers. This empowered their IT team to control all workloads, clusters, hosts, and virtual machines from a single vCenter with complete flexibility and high availability.

Implementation Strategy

Infrastructure

The solution uses bare metal machines for the ESXi hypervisors, an OS Nexus SAN storage system, and a Brocade vRouter firewall to ensure that VPN connectivity between data centers is established and maintained securely, with complete granular control.

A complete set of VLANs was configured to segment traffic for management, redundant storage, public traffic, and a three-tier environment for database, application servers and web servers.
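To make the segmentation concrete, the sketch below expresses a hypothetical VLAN plan of that shape as a small Python data structure and runs a basic overlap check. The VLAN IDs and subnets are illustrative assumptions, not values from the GELA deployment.

```python
import ipaddress

# Hypothetical VLAN/subnet plan mirroring the traffic segmentation described
# above; IDs and address ranges are illustrative, not the actual GELA values.
vlan_plan = {
    "management": {"vlan_id": 10, "subnet": "10.10.10.0/24"},
    "storage_a":  {"vlan_id": 20, "subnet": "10.10.20.0/24"},  # redundant SAN path A
    "storage_b":  {"vlan_id": 21, "subnet": "10.10.21.0/24"},  # redundant SAN path B
    "public":     {"vlan_id": 30, "subnet": "192.0.2.0/24"},
    "web_tier":   {"vlan_id": 40, "subnet": "10.10.40.0/24"},
    "app_tier":   {"vlan_id": 41, "subnet": "10.10.41.0/24"},
    "db_tier":    {"vlan_id": 42, "subnet": "10.10.42.0/24"},
}

def overlapping_subnets(plan):
    """Return pairs of VLANs whose subnets overlap (a basic sanity check)."""
    nets = {name: ipaddress.ip_network(cfg["subnet"]) for name, cfg in plan.items()}
    names = list(nets)
    return [(a, b) for i, a in enumerate(names) for b in names[i + 1:]
            if nets[a].overlaps(nets[b])]

print(overlapping_subnets(vlan_plan))  # expect [] for a cleanly segmented plan
```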

Connectivity

The on-premises network comprises IBM and Juniper switching and routing, and an IPsec VPN tunnel was established between a Juniper SRX cluster and the Brocade vRouter in IBM Cloud.

Tech Stack Deployed

  • IBM Cloud Bare Metal servers.
  • IBM Cloud Virtual Instances.
  • OS Nexus Virtual Appliance.
  • Brocade vRouter.

Success Data

  • Reduction in cost for VMware Enterprise Plus ESXi licensing
  • Increased availability of services
  • Reduced personnel required to manage the public cloud infrastructure (FTE)
  • A single platform for on-premises and on-cloud workloads

J Med Internet Res. 2020 Aug;22(8)

Blockchain in Health Care Innovation: Literature Review and Case Study From a Business Ecosystem Perspective

Shuchih Ernest Chang

1 Graduate Institute of Technology Management, National Chung Hsing University, Taichung, Taiwan

YiChian Chen

Blockchain technology is demonstrating its innovative potential in various sectors, and its transformation of business-related processes has drawn much attention. Research interest has focused on medical and health care applications, with contributions generally taking the form of system designs, literature reviews, and case studies. However, a general overview of blockchain’s impact on the health care ecosystem is still limited.

This paper explores a potential paradigm shift and ecosystem evolution in health care utilizing blockchain technology.

A literature review was conducted together with a case study of a pioneering initiative. Using a systematic life cycle analysis, this study sheds light on the evolutionary development of blockchain in health care scenarios and the interactive relationships among stakeholders.

Four stages—birth, expansion, leadership, and self-renewal or death—in the life cycle of the business ecosystem were explored to elucidate the evolving trajectories of blockchain-based health care implementation. Focused impacts on the traditional health care industry are highlighted within each stage to further support the potential health care paradigm shift in the future.

Conclusions

This paper enriches the existing body of literature in this field by illustrating the potential of blockchain in fulfilling stakeholders’ needs and elucidating the phenomenon of coevolution within the health care ecosystem. Blockchain not only catalyzes the interactions among players but also facilitates the formation of the ecosystem life cycle. The collaborative network linked by blockchain may play a critical role in value creation, transfer, and sharing among the health care community. Future efforts may focus on empirical or case studies to validate the proposed evolution of the health care ecosystem.

Introduction

In the last decade, blockchain technology has gained growing attention from both academia and practitioners in a range of industries, including banking, insurance, trade, and medicine. Blockchain has potential in various industries, including in financial applications, supply chains [ 1 ], the insurance industry [ 2 ], and even medical health care records [ 3 - 5 ]. Through maintaining an immutable, tamper-proof, consecutive list of transactional data in a distributed network, blockchain has created several disruptions in incumbent business processes with its unique features. Having a promising capability to improve information flow, sharing, and transmission among participating nodes (ie, partners in the real system), blockchain is expected to transform legacy operations with innovative service delivery and ownership transfer [ 6 ]. Blockchain adoption and pioneer pilots in different sectors have shown its power in transforming traditional working paradigms.

Blockchain, as a kind of distributed ledger technology, enables data storage, sharing, and verification over a distributed peer-to-peer network [ 7 ]. Participating nodes (ie, entities) may cooperatively maintain the common shared ledger by contributing efforts to data verification via cryptography. Blockchain can be viewed as a consecutive list of transactions, each chronologically appended to the previous ones. Updates of any part need to be verified and then recorded on the chain. This process is achieved by participating nodes’ contributions to solving a cryptographic puzzle, which in turn increases the difficulty of malicious tampering and alteration. In this sense, all transactions are visible and immutable for all parties, thus providing audit trails and data integrity. In addition, its affiliated technology, smart contracts, can be deployed on blockchain-based platforms to activate or enforce specific desired processes. Smart contracts are computer protocols that aim to execute the terms of a contract or agreement [ 8 ]. In practice, smart contracts can be coded with computer languages to interact with one another and be triggered by events in the real world [ 9 ]. These attributes, when deployed on a blockchain system, may facilitate business logic and process automation.
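As a concrete illustration of the append-only, hash-linked structure described above, here is a minimal Python sketch. It models only the hash linking and tamper detection, not the distributed consensus or puzzle-solving that real blockchain networks rely on, and all transaction names are hypothetical.

```python
import hashlib
import json
import time

def _hash(block: dict) -> str:
    """Deterministic SHA-256 digest of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

class MiniLedger:
    """Toy append-only ledger: each block stores the previous block's hash,
    so altering any earlier transaction invalidates every later link."""

    def __init__(self):
        genesis = {"index": 0, "timestamp": 0.0, "tx": None, "prev_hash": "0" * 64}
        self.chain = [genesis]

    def append(self, tx: dict) -> dict:
        block = {
            "index": len(self.chain),
            "timestamp": time.time(),
            "tx": tx,
            "prev_hash": _hash(self.chain[-1]),
        }
        self.chain.append(block)
        return block

    def is_valid(self) -> bool:
        return all(self.chain[i]["prev_hash"] == _hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = MiniLedger()
ledger.append({"from": "lab_A", "to": "clinic_B", "record_id": "rx-001"})
ledger.append({"from": "clinic_B", "to": "pharmacy_C", "record_id": "rx-001"})
print(ledger.is_valid())                       # True
ledger.chain[1]["tx"]["record_id"] = "rx-999"  # tamper with an earlier transaction
print(ledger.is_valid())                       # False: later links no longer match
```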

Recent publications, including technical reports, research articles [ 10 , 11 ], and consulting papers [ 12 ], have addressed blockchain’s potential to reshape the complex operations in the field of health care. Blockchain applications in the realm of health care may be promising; however, the compositions and interactions among major health care stakeholders, such as patients, care service providers, pharmacies, funders and insurers, medical device suppliers, and research organizations, are rather complex (see Figure 1 ). Extant research on how these stakeholders may benefit from the use of blockchain technology has mostly been framed from the perspective of a single industry, and the development and potential evolution of blockchain-based health care have been discussed less comprehensively. It is noted that activities and interactions among stakeholders may cross a variety of industries. As Moore [ 13 ] has suggested, a careful systematic approach to business strategy needs to consider firms in the scope of a larger ecosystem rather than as members of a single industry. To better elucidate the evolution of a health care ecosystem utilizing blockchain innovation, stakeholders must address cooperative and competitive issues when attempting to deliver tangible and intangible value to meet customer needs.

Figure 1. Typical health care ecosystem.

Through unique distributed schemes and immutable shared ledgers, blockchain allows better transparency, security, privacy, and traceability, as well as a trust-free environment among players [ 14 ]. This implies that blockchain connects not only individual siloed databases via decentralized governance but also the ecosystem surrounding health care stakeholders. However, this may lead to more complex supply-and-demand relationships and interactions among actors who have operated their businesses in an originally centralized manner. Therefore, this study attempts to shed light on the driving inertia from a business ecosystem perspective rather than through a traditional supply chain lens. Moore [ 15 ] defined the business ecosystem as an economic community loosely connected by a group of interacting organizations and individuals who share common values and who coevolve with one another. Researchers have also extended this argument by addressing cross-industry collaboration rather than disparate interactions among directly connected counterparties [ 16 , 17 ]. This concept provides a broader vision of how blockchain interplays with, connects, and disintermediates the dynamic relationships among connected medical communities, service providers, and end customers.

However, there is very limited research on blockchain-based health care ecosystems in the extant literature. Previous research efforts on blockchain mainly focused on technological potential [ 18 ], individual applications [ 19 ], medical record accessibility [ 20 ], and general influence. Others discussed the proof-of-concept of system design [ 21 , 22 ], adoption attitudes [ 23 ], governance, challenges, and opportunities in future research [ 24 - 26 ]. Few extant articles in the literature have addressed dynamic relationships among medical stakeholders with an overview of the blockchain ecosystem. Therefore, this research aims to investigate how blockchain can lead to a coevolving health care ecosystem by collating overviews of potential evolutions of blockchain-enabled health care applications from recent literature from a perspective of the business ecosystem. In this study, we address two research questions:

  • Research question 1: What potential effects do recent innovations and applications that make use of blockchain have on the health care industry?
  • Research question 2: How do health care stakeholders participate, interact, and evolve in the blockchain-based ecosystem and how do they collaboratively contribute to a potential paradigm shift?

To shed light on blockchain’s influence on value creation and capture of medical stakeholders, we examine and address these research questions from a perspective of the business ecosystem, with an aim to contribute to the body of knowledge in health care.

Existing Service Process and Blockchain Roles

Traditionally, medical information is located at disconnected databases in clinics, labs, or medical institutions. Aggregating health data from disparate sources and gaining a holistic view of patient treatment history have been difficult and costly. As blockchain can store transaction logs among participants, better transparency and completeness of treatment history could be achieved. Blockchain may drive the digital transformation of legacy information sharing [ 27 ]. Traditional paper-based processes and manual processing could be reduced and better interoperability among disconnected health systems is feasible. In addition, traditional medical supply chains have suffered from poor traceability and invisible provenance. Blockchain may provide solutions to improve transparency and real-time monitoring from manufacturing to delivery. Other focused areas also include secure identity management [ 28 ], audit and governance, and facilitation for medical research (see Figure 2 ).

Figure 2. Blockchain’s role in improving the health care service system. AR: augmented reality; EHR: electronic health record; EMR: electronic medical record; IoT: internet of things.

Literature Review

To answer research question 1, we conducted a literature survey to find the current state and potential of blockchain applications in the health care field. Rather than following a fully systematic approach, we focused on specific applications through which blockchain may transform the interactions and workings of a health care ecosystem. Some review articles from the recently published literature were also selected to help understand the potential evolution among health care stakeholders.

Figure 3 illustrates the search and review process of the focused literature. We searched for blockchain studies in medical and health care fields and conducted subsequent article screening and identification; abstract and text reviewing were conducted to select focused literature. The numbers in the flowchart boxes in Figure 3 denote articles that were available after the respective procedural steps. From the ecosystem perspective, the literature selection and extraction criteria paid attention to the capabilities that relevant studies highlight and that elucidate essential ingredients for constructing blockchain-based ecosystem partnerships. Some review articles were added to give a general overview of blockchain-based health care studies. Sampled articles were extracted from the filtered corpus to highlight focused topics, such as data management, information sharing, access control, security, and privacy.

Figure 3. The procedural framework for the focused literature review. MEDLINE: Medical Literature Analysis and Retrieval System Online.

In this study, recent blockchain-based health care projects were examined to shed light on the disruption to health care practice. The case study, a qualitative method, refers to a systematic analysis of a specific target from a wide perspective and enables comprehension throughout the exploration process [ 29 ]. In applying this approach, researchers have suggested that research objectives, contexts, and representativeness need to be stressed [ 30 , 31 ]; selection protocols suggested in the previous literature were followed, drawing on secondary resources from news archives, consulting reports, company websites, and academic articles [ 32 ]. Case study results were then collated to answer research question 2 and to elucidate the cooperative and competitive strategies and operational business schemes in the health care context. We selected the IBM blockchain–health care initiative [ 33 ] as the target case and combined the concept of the business ecosystem with the health care context to analyze the interactions, cooperative or competitive, among species (ie, ecosystem members). Furthermore, the major players’ roles and influences in the blockchain-based health care ecosystem were analyzed to derive research implications.

Business Ecosystem Perspective

This study analyzed the potential evolutionary path of blockchain-based health care innovation from a business ecosystem perspective. Moore [ 13 ] proposed the life cycle of a business ecosystem and divided it into four stages: birth, expansion, leadership, and self-renewal or death. We identified four major development stages within which health care stakeholders interact with each other and evolve chronologically, along with their roles and cooperative and competitive challenges. Iansiti and Levien [ 34 , 35 ] extended Moore’s concepts by defining the roles of actors and argued that these roles are formed by large, loosely connected networks of entities. They further classified the actors’ roles into three categories: keystone, dominator, and niche player. The business ecosystem comprises diverse participants across various industries, and its overall health depends on the positive interactions and operations among stakeholders.

Birth: Pioneering

During this stage, entrepreneurs focus on the value creation or proposition that meets customers’ needs. The product or service needs to be presented in its best form to draw potential customers’ attention and effectively deliver its value. Leaders in the ecosystem aggregate suitable suppliers to take part in the environment and attempt to incorporate business partners’ capabilities to optimize the value package to customers.

Expansion

The ecosystem grows and expands its territory. The business ecosystem faces competition to increase market share against its rivals, and firms may devote considerable effort to marketing activities to increase sales. Meanwhile, to improve overall performance, issues regarding large-scale adoption and distribution are stressed. In this stage, while incomplete ecosystems are likely to be expelled from competition, superior ones may integrate community members to complete sound supply chains, thus achieving ecosystem stability. Required conditions in the expansion stage include value-oriented business concepts and the corresponding potential to broaden scalability.

Leadership: Authority

Following the expansion, the leader or integrator needs to guide the direction of investment and technology standards. As innovation is a crucial factor for evolving ecosystems, stakeholders may find their positions and revenue models through the leader’s guidance. While the bargaining power of suppliers increases during this stage, the system integrator needs to enhance the supply chain management with alternative options to assure the stability of production and distribution. How firms constantly create values to maintain their importance in the ecosystem is critical to the overall health and continued improvement of the ecosystem.

Self-Renewal or Death

This stage occurs when firms face external threats, for example, changes in regulations or the rise of new ecosystems and innovations such as emerging technologies. Original business communities may undergo different levels of change and fluctuation, and the altered environment may challenge the survival of original members. How leaders detect potential changes, new incoming elements, and threats, and how they react to these alterations, may decide the future outcome of the ecosystem. When facing obsolescence, whether an ecosystem renews itself by incorporating new innovative ideas or steps toward death depends on its capability to enable system transformations.

Reviewed Literature

The pursuit of building a sustainable and healthy ecosystem is essential for participating stakeholders. Considering the requirements for building a health care ecosystem, we identified issues that are being addressed by extant research studies and selected a number of articles to elucidate the recent research foci. Table 1 summarizes several related articles regarding blockchain in health care; these articles were published in rigorous peer-reviewed scientific journals. Focused topics in the blockchain–health care ecosystem are briefly collated in the following sections.

Table 1. Overview of blockchain-based health care applications in the research literature.

Decentralized Storage and Medical Data Management

Aggregating health data from disparate sources has long been a major pain point for further medical usage, and disconnected data sources need better integration and aggregation. Building on the distributed nature of blockchain, researchers have addressed data storage and management issues in clinical trials [ 22 ], insurance [ 2 ], and personal health scenarios [ 36 ].

Information Sharing and Identity Management

Based on the immutable and distributed features of blockchain, a common shared ledger may facilitate health information exchange (HIE). Some proof-of-concept studies have covered the potential and major contributions to these topics; for example, Ali et al [ 37 ] focused on remote health monitoring, Hau et al [ 23 ] surveyed stakeholders’ attitudes, and Esmaeilzadeh and Mirzaei [ 18 ] conducted an experimental study to understand patients’ perceptions of various exchange mechanisms. In addition, while several researchers conducted literature reviews to shed light on potential strengths and limitations of blockchain applications [ 38 , 39 ], others reviewed potential identity management solutions [ 28 ] and developed evaluation frameworks for assessing performance of blockchain initiatives [ 40 ].

Access Control, Security, and Privacy

As access control and authentication are major security requirements for managing health care and medical data, researchers have proposed blockchain-based prototypes to provide solutions for current health systems [ 41 , 42 ]. Digitization of electronic medical records (EMRs) may introduce cyberattack risks to data security and privacy when stakeholders, such as providers, payers, and researchers, attempt to interact with patient data. Blockchain-enabled solutions may help maintain sensitive patient data through a privacy-friendly approach [ 43 - 45 ].

Case Study of the IBM Blockchain–Health Care Initiative

On January 24, 2019, IBM announced its collaborative blockchain initiative with major health care players, including Aetna (acquired by pharmacy and health plan provider CVS Health), Anthem (health plan provider), Health Care Service Corporation (the largest customer-owned health insurance provider in the United States), and PNC Bank [ 46 ]. IBM has been searching for new opportunities by leveraging the potential of blockchain and attempting to build up a special networked health care ecosystem. In the last few months, health organizations, health care providers, start-ups, and technology companies joined in this initiative to grow the Health Utility Network, of which Cigna and Sentara Healthcare are participants. The aim is to drive digital transformation by providing better transparency and interoperability. Participants may reap benefits from building, sharing, and deploying solutions to incumbent challenges in the health care context. Major issues and potential blockchain use cases are enumerated as follows:

  • Provenance and traceability of pharmaceutical supply chain: fake and counterfeit drugs could be troublesome and dangerous issues as drug provenance is difficult to track in a cross-border setting. A large number of handovers from manufacturers, shippers, distributors, retailers, and pharmacies may cause inaccuracies and disputes in medical delivery operations. Counterfeit drugs with improper ingredients and dosages may jeopardize the health of patients and even cause legal disputes among manufacturers, suppliers, and customers. With immutable, tamper-proof, and trackable characteristics, blockchain may provide solutions to authenticity and traceability of transferred assets along with auditable and secure transaction records among stakeholders. For example, in a private drug blockchain, drug registration by pharmaceutical companies may grant a higher level of trustworthiness and authentic proof. Also, these companies, acting as dominators, could assign the roles of the actors; some of them may have the rights for registration while others may conduct verification of transactions. The provenance of drugs can be assured via verification processes with related manufacturing or identity information when appended on-chain, making it easy to be tracked.
  • Data management during clinical trials: when clinical trials are conducted, numerous data are produced by different devices operated by medical staff. How these data are stored, transmitted, shared, and utilized for medical therapy or operations is critical in existing manual systems. Errors and fraud during clinical trial operations can be introduced through malicious alterations or unintentional mistakes; typical flaws occur when trial procedures are designed with bias or when records and responses in patients’ evolving medical reports are inconsistent. Blockchain in this case could provide proof-of-existence for any form of documentation. The information needs to be verified via the consent of the participating nodes and is not under a single entity’s control. Modifying or changing information would be cryptographically difficult to conduct against a majority of network players, thus making documentation highly trusted.
  • EMR and electronic health record (EHR) management: where patient or medical records are concerned, a challenge is that individual medical data are not easily accessed across different medical institutions or clinics. While medical information is stored disparately in various databases or systems, it is difficult to deliver proper medication and care in a personalized context, and the sensitivity of the data can hinder transmission efficiency among medical organizations. How to access, share, and utilize a holistic medical treatment history in a secure way remains a challenging issue for centralized EMR systems. With the help of distributed ledger technology, however, blockchain may have potential for the manipulation and access control of such EHR and EMR systems. Blockchain platforms can be combined with existing EHR and EMR systems, whether in a cloud computing environment or otherwise, through the use of oracles and data gateways. Patients can share their medical records with registered users or stakeholders on a medical blockchain and may decide the level of information disclosure to specific users through smart contract settings, receiving rewards from the blockchain system accordingly (a minimal sketch of such permission checks follows this list). As described above, blockchain could facilitate the sharing and management of EHRs and EMRs among supply and demand entities. Related data analysis and rewards from sharing could potentially promote the participation of the medical community and, consequently, leverage a network effect.
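The following Python sketch illustrates, under simplified assumptions, the kind of patient-controlled permissioning and reward accrual described in the last item above. The access levels, role names, and reward scheme are hypothetical stand-ins for on-chain smart contract logic, not part of the IBM initiative.

```python
from dataclasses import dataclass, field

# Hypothetical permission levels a patient might encode in a smart contract;
# names and reward amounts are illustrative only.
ACCESS_LEVELS = {"none": 0, "summary": 1, "full_history": 2}

@dataclass
class ConsentContract:
    """Toy stand-in for an on-chain consent record: the patient sets a level
    per requester role, and each granted access accrues a reward credit."""
    patient_id: str
    grants: dict = field(default_factory=dict)   # role -> access level name
    reward_credits: int = 0

    def set_permission(self, role: str, level: str) -> None:
        if level not in ACCESS_LEVELS:
            raise ValueError(f"unknown access level: {level}")
        self.grants[role] = level

    def request_access(self, role: str, needed: str) -> bool:
        granted = ACCESS_LEVELS[self.grants.get(role, "none")]
        if granted >= ACCESS_LEVELS[needed]:
            self.reward_credits += 1   # the patient is credited for sharing
            return True
        return False

consent = ConsentContract(patient_id="patient-42")
consent.set_permission("research_org", "summary")
print(consent.request_access("research_org", "summary"))       # True
print(consent.request_access("research_org", "full_history"))  # False
print(consent.reward_credits)                                   # 1
```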

In health care, major inefficiencies can arise from clinical operations, administrative processing, and frictions among disparate systems. These pain points have decreased the overall performance and have led to poor customer experiences in regard to incumbent medical and health care systems and services. The act of incorporating major players through blockchain-based systems and services in health care may help to develop a healthy, open-networked, and collaborative ecosystem. The blockchain-enabled collaboration aims to address the aforementioned challenges by pursuing reduced administrative error, mitigated system frictions, streamlined claims and payment transactions, and efficient information exchange. Iansiti and Levien expanded Moore’s ecosystem view and proposed the strategies that firms might adopt to position themselves in the business ecosystem. The strategic roles include keystone, niche player, and physical dominator. The keystone in the business ecosystem provides a platform to which niche players add value and build offerings. Niche players account for the bulk proportion of the ecosystem and are responsible for value creation and innovation. The physical dominator directly controls the majority of a network via horizontal or vertical integration. In an IBM blockchain ecosystem, the major players’ roles and corresponding functions are shown in Table 2 and are summarized as follows:

Table 2. Major players’ roles and influences in a blockchain-based health care ecosystem.

  • IBM: keystone—blockchain platform provider and coordinator.
  • Aetna of CVS: niche player—improves data accuracy and optimization of health care system operation.
  • Anthem: niche player—medical information exchange.
  • Health Care Service Corporation: physical dominator—reduces information fragmentation and improves claims procedures and health care system connection.
  • PNC Bank: niche player—facilitates payment transactions and supports medical finance.

Embracing blockchain technology is not exclusive to this initiative. Competitors making similar efforts, such as Change Healthcare, Hashed Health, Guardtime, Gem, and SimplyVital Health, have also teamed up to launch a blockchain pilot—Intelligent Healthcare Network with Blockchain Processes—in the realm of health care. Other competing projects with somewhat different focuses have also led to consortia competition. Prominent examples include Synaptic Health Alliance, targeting provider directories and data reconciliation, and ProCredEx, focusing on the storage and sharing of medical credentials. PNC Bank, acting as a partner in the interdisciplinary alliance, stands in a public position and contributes its edge to facilitate transactions among patients, payers, and providers in both domestic and cross-border contexts.

Business Ecosystem With Evolutionary Life Cycle

Blockchain, as an emerging technological innovation, has provided opportunities for incumbent health care stakeholders. As for the IBM case, a collaboration of health care partners has resulted in a new ecosystem. Its potential evolutionary stages have formed a business ecosystem lens; these stages are summarized in Table 3 .

Table 3. The evolutionary path of a blockchain–health care ecosystem: the IBM case.

At the birth stage, the IBM blockchain–health care pilot faces consortia competition from other alliances. Even though the focused markets may differ slightly from pilot to pilot, the efforts and objectives for driving digital transformation in the health care industry are largely the same. IBM, as a recognized leading enterprise blockchain provider, possesses an advantageous edge against competitors. When entering the expansion stage, the key focus is to bring new innovations to market to increase market share. This could be carried out by optimizing platform functionality, absorbing complementary health care members, and addressing the changing demands of customers. In addition, to outperform rival ecosystems, it is essential to build up technical or industrial standards in terms of competitive strategy [ 47 ]. During the leadership stage, the leading ecosystem may focus on future prospects for followers. This could be implemented by rallying suppliers and customers behind compelling visions, for example, integration with other disruptive technologies such as machine learning, artificial intelligence, mobile and ubiquitous health, wearables, and the internet of things (IoT). Conversely, to relieve pressure from suppliers’ increased bargaining power, actions such as backward integration, sourcing from multiple suppliers, raising the ecosystem’s profile, and conducting market education are needed. At the last stage, the blockchain–health care ecosystem may step toward self-renewal or death. This depends largely on the capabilities the existing ecosystem possesses; it can either innovate or be replaced by alternative ecosystems or paradigms.

Comparative Analysis of the Existing System and the Future Ecosystem

Blockchain applications in the health sector have been receiving increased attention and hold growing promise. We have summarized the current health care service pain points and highlighted the potential of blockchain in reshaping traditional practice and operations. Researchers have conducted literature reviews to report on the current challenges [ 48 , 49 ]. The major issues, with the corresponding potential effects of blockchain, are listed in Table 4.

Table 4. Health care service pain points and the potential effects of blockchain in the health care ecosystem.

Blockchain Impacts and the Changing Paradigm on the Health Care Ecosystem

This study collated blockchain-related literature in the health care industry. While many research efforts have highlighted the potential effects of blockchain from the viewpoint of a single firm or industry, we attempted to shed light on its power in a more holistic manner, focusing on the inclusive health care ecosystem. This changing and evolving paradigm may go through complicated cooperative and competitive challenges involving the participating stakeholders. Therefore, from the illustrative case—the IBM blockchain–health care initiative—we elucidated and discussed the potential impacts and complex interactions during the life cycle of the component species, or players. Five critical issues that arise when coevolving with blockchain adoption are discussed to provide implications for researchers and practitioners.

Health Information Exchange With Interoperability and Integrity

HIE has long been a critical issue where data interoperability is concerned; only with an effective information exchange scheme can the true value of health care information be unleashed [ 50 ]. Recently, a proliferation of publications and pilots have addressed the issue of medical records and health records. A decentralized scheme using a commonly shared ledger for information sharing offers innovators opportunities to disrupt traditional practice [ 51 ]. Health care data has given blockchain-enabled applications strong penetration points into the health care industry. Blockchain-enabled health information exchange may reduce frictions among siloed databases as well as the costs imposed by intermediaries [ 12 ]. To facilitate information exchange among disparate data systems across individual organizations, transmission protocols or standards need to be addressed to provide data integrity; integrating transmission protocols mitigates the effects of potentially missing information and avoids incompatibilities. In addition, blockchain’s distributed framework may support cross-system health information usage. However, due to current technological limitations in designing blockchain applications, limited block size could become an issue for extended scalability. Therefore, only critical transactions will be appended on-chain, and supporting data access schemes will be necessary for data manipulation. While blockchain could allow interoperability among health systems, incentives for individual stakeholders may become essential when creating beneficial models and supporting sustainable ecosystems. In this regard, blockchain may unlock the true value of interoperability and achieve a higher level of disintermediation.
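One common way to work around the block-size constraint mentioned above is to keep the bulky record off-chain and anchor only its hash on-chain. The Python sketch below illustrates that split under simplified assumptions: the in-memory dictionaries standing in for the ledger and the document store, and all record fields, are hypothetical.

```python
import hashlib
import json

off_chain_store = {}   # stands in for a hospital database or document store
on_chain_anchors = []  # stands in for the (size-constrained) shared ledger

def anchor_record(record: dict) -> str:
    """Keep the bulky record off-chain; append only its digest on-chain."""
    digest = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
    off_chain_store[digest] = record
    on_chain_anchors.append(digest)
    return digest

def verify_record(digest: str) -> bool:
    """Anyone holding the off-chain record can check it against the anchor."""
    record = off_chain_store.get(digest)
    if record is None or digest not in on_chain_anchors:
        return False
    recomputed = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
    return recomputed == digest

d = anchor_record({"patient": "p-42", "visit": "2020-03-01", "notes": "..."})
print(verify_record(d))                       # True
off_chain_store[d]["notes"] = "edited later"  # tamper with the off-chain copy
print(verify_record(d))                       # False: digest no longer matches
```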

Digital Identity Management

Traditional identity management has been subject to the limitations of a centralized mechanism, such as security, privacy, and scalability. Centralized identity management is vulnerable to malicious attacks and alterations, thus being prone to theft, counterfeit, and fraud risks [ 28 ]. In addition, credentials required to request registration or access to health care services are also prone to misuse or to causing privacy disclosure. Distributed identity management may provide solutions to these limitations with its capabilities of ensuring data integrity and information sharing across different health care systems if deployed on an immutable and distributed network. The distributed model may also solve the duplicate and multi-version identity issues in health care use cases. Due to these features, identity owners may have full control of their unique digital identities and, in turn, enjoy benefits as the stakeholders in a valuable health care ecosystem. This implies that users have become the owners of their health data without the intermediation supported by traditional identity management systems. A higher degree of freedom to access, release, or share medical records has become possible. The blockchain-enabled digital identity is also useful for managing health care supply chain activities, such as the ownership transfer of specific assets. After all, as health care data are normally sensitive and confidential in nature, blockchain identity may leverage its characteristics to grant better security and privacy by reducing manual intervention and operational failures.
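To illustrate the idea of user-held digital identities, here is a minimal sketch using the third-party Python cryptography package: the patient keeps the private key, shares only the public key, and proves control of the identity by signing a consent assertion. The "did:example" identifier scheme and the assertion text are illustrative assumptions, not part of any specific identity standard discussed in the paper.

```python
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The patient generates and keeps the private key; only the public key
# (and an identifier derived from it) is shared with the network.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()
public_bytes = public_key.public_bytes(
    encoding=serialization.Encoding.Raw,
    format=serialization.PublicFormat.Raw,
)
identity_id = "did:example:" + hashlib.sha256(public_bytes).hexdigest()[:16]

# Signing a consent assertion proves control of the identity without
# relying on a central identity provider.
assertion = b"patient consents to share 2020 lab results with research_org"
signature = private_key.sign(assertion)

try:
    public_key.verify(signature, assertion)  # raises InvalidSignature if forged
    print(f"{identity_id}: assertion verified")
except InvalidSignature:
    print("verification failed")
```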

Health Care Supply Chain Management

Blockchain’s immutable and tamper-proof attributes enable disruptive innovation in supply chain management. In a health care ecosystem, records of goods, such as drugs, and service flows could be recorded on-chain to provide better logistics visibility and timeliness. The integration of blockchain and medical IoT devices may be the next evolution of blockchain technology in the realm of supply chain management. A large amount of medical data generated by medical devices may be stored across different stakeholder systems; with the aid of blockchain, patient-generated data can be stored off-chain but accessed under permissions preset by blockchain-based smart contracts. In this regard, HIE can become more streamlined without intermediation. Another blockchain use case is the drug or pharmaceutical supply chain, where typical pain points occur during handovers across stakeholders. Blockchain provides better transparency over supply chain activities, and players can gain better control over product and service flows. Moreover, primary concerns also arise from the provenance of the drug supply: fake and counterfeit drugs have proliferated due to poor authentication and traceability from manufacturing and shipping through to delivery. The movement of drugs could be recorded on blockchain to provide better real-time monitoring as well as to halt the distribution of fake drugs. This implies that trackable footprints verified by participating players can help secure drug supply chains.
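As a small illustration of the track-and-trace idea, the sketch below records hypothetical handover events for a drug batch and reconstructs its chain of custody. In a real deployment each handover would be a verified on-chain transaction rather than an entry in an in-memory list, and the party and batch names are made up.

```python
from datetime import datetime, timezone

# Illustrative custody trail for a drug batch; party names and batch IDs
# are hypothetical.
handovers = []

def record_handover(batch_id: str, sender: str, receiver: str) -> None:
    """Append one custody-transfer event for a batch."""
    handovers.append({
        "batch_id": batch_id,
        "from": sender,
        "to": receiver,
        "at": datetime.now(timezone.utc).isoformat(),
    })

def provenance(batch_id: str) -> list:
    """Reconstruct the full chain of custody for one batch."""
    return [h for h in handovers if h["batch_id"] == batch_id]

record_handover("BATCH-001", "manufacturer", "shipper")
record_handover("BATCH-001", "shipper", "distributor")
record_handover("BATCH-001", "distributor", "pharmacy")

for step in provenance("BATCH-001"):
    print(f'{step["from"]} -> {step["to"]} at {step["at"]}')

# A batch whose trail does not start at a registered manufacturer, or whose
# steps cannot be verified on-chain, would be flagged as suspect.
```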

Medical Research and Data Exploitation

Medical records have long been managed with a centralized approach. However, the disconnected health systems that exist across different clinics or health organizations may hinder further usage of EHRs and EMRs by medical researchers [ 51 ]. A considerable number of medical records are stored in paper-based documents or in electronic health systems with poor interoperability. Poor efficiency in health care information exchange and the rising costs of administrative processing have locked away the true value of medical information. Traditionally, researchers have had difficulty acquiring patient data and medical records, largely because of questions of where the data reside and how they can be accessed. To address data sharing and exploitation among parties and research institutes, researchers have proposed a privacy-preserving model [ 52 , 53 ] and an incentive mechanism [ 54 ] for the course of data collection, sharing, and collaborative exploitation. With a shared health care ledger system, researchers may reap benefits from this changing paradigm: they may access related data by checking smart contract conditions if the use is permitted by patients, and patients could receive rewards or credits from researchers’ contributions or payments by granting different levels of permission, which are coded and stored by smart contracts on the blockchain, to release specific data. In sum, blockchain may give control of data access to patients, and researchers could pay for access. In this regard, the traditional pain points of collecting patient data could be resolved, facilitating the conduct of research. Data reconciliation during research design and clinical trials may become easier with a shared medical ledger, thus improving health care and medical treatment.

Automation of Financial Transactions and Insurance Procedures

A lack of trust between health care stakeholders may affect the overall performance of financial transactions in the health care industry, for example, by impeding the adoption of alternative payment models between payers and providers. Examining current reimbursement models and claims procedures, we found hindrances to processing efficiency, transparency, and visibility among ecosystem members. In current insurance practice, for example, multiple middlemen and intermediaries are involved throughout the handling of health insurance policies. Shared information could help insurers find better providers and verify whether providers meet their obligations and contractual terms. Smart contracts may replace the effort of drafting complex, value-based paper contracts and may automate the execution of terms or agreements. Through smart contracts, entities can set up logical process flows that run when preset conditions regarding health care activities are met, as the sketch following this paragraph illustrates. Deploying smart contracts on decentralized immutable ledger systems could also make payment and claims records visible and support post-hoc audit and review. In this sense, the handling of data exchanges and payment transfers between insurers and their stakeholders could become easier and less expensive.
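The following Python sketch mimics that kind of conditional automation: a claim is released for payment only when preset conditions hold, and everything else is routed to manual review. The field names, procedure codes, and the auto-approval threshold are hypothetical, not drawn from any real payer's rules or from the IBM initiative.

```python
from dataclasses import dataclass

@dataclass
class Claim:
    """Hypothetical claim fields used by the toy adjudication rules below."""
    claim_id: str
    procedure_code: str
    amount: float
    provider_verified: bool
    preauthorized: bool

AUTO_APPROVE_LIMIT = 500.0  # claims above this amount go to manual review

def adjudicate(claim: Claim) -> str:
    """Mimics a smart contract that releases payment automatically when
    preset conditions are met, and defers everything else to review."""
    if not claim.provider_verified:
        return "rejected: unverified provider"
    if claim.preauthorized and claim.amount <= AUTO_APPROVE_LIMIT:
        return "approved: payment released"
    return "pending: routed to manual review"

print(adjudicate(Claim("c-1", "99213", 120.0, True, True)))    # auto-approved
print(adjudicate(Claim("c-2", "27447", 18000.0, True, True)))  # manual review
print(adjudicate(Claim("c-3", "99213", 120.0, False, True)))   # rejected
```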

Limitations

In this study, we conducted a literature review to investigate the potential impacts of blockchain-based health care innovations. Along with selected pilot cases, we discussed the positions and promise that blockchain may bring to the health care ecosystem. While researchers and practitioners have high hopes, challenges remain before the large-scale adoption of blockchain, owing to technical limitations, health care service operations, and regulatory concerns. Constrained by the level of blockchain maturity in various health care subsectors, different use cases and clinical trials need more empirical support to report on real-world performance. We collated extant research efforts and attempted to shed light on a potential paradigm shift in the future health care ecosystem. Such an endeavor may be subject to uncertainties from the changing environment, technology limitations, or emerging innovations.

This study aims to answer questions on the evolution and development of blockchain technology in health care research and on how stakeholders coevolve in this environment. From the perspective of the business ecosystem, we identified research articles about blockchain-enabled health care and we covered prototype designs and leading pilot cases in recent years. The evolutionary trajectory and interactions among major health care stakeholders may potentially formulate the blockchain-based health care ecosystem. Key players have presented their roles and interacted with one another to go through the life cycle of the business ecosystem. We illustrated their potential and the phenomenon of coevolution within the health care ecosystem. It is noted that while the literature in this field has proliferated recently, mostly regarding proof-of-concept studies, framework propositions, and trial pilots, a careful consideration on embracing such technology still needs to address technical limitations, privacy, mindset, and legal concerns. Our perspective and analysis show that large-scale adoption would need long-term support from health care stakeholders. Future research may devote more efforts to building up evaluation models to provide practical implications for practitioners. Whether feasible business models may sustainably survive in such an ecosystem needs attention from scholars. With a better understanding of how stakeholders coevolve within the ecosystem, players may reap their benefits in a more efficient manner to propel a potential blockchain–health care paradigm shift.

Acknowledgments

This research was supported by the Ministry of Science and Technology, Taiwan, under contract numbers MOST-106-2221-E-005-053-MY3 and MOST-109-2221-E-005-043.

Abbreviations

EHR: electronic health record
EMR: electronic medical record
HIE: health information exchange
IoT: internet of things

Conflicts of Interest: None declared.


Interprofessional Case Study Event Elevates Patient Care Strategies

By Office of the President | Apr 2, 2024


The Sixth Annual Interprofessional Case Study event, organized by Downstate's School of Health Professions (SOHP) in collaboration with the College of Nursing and the School of Public Health, took place on March 21. Engaging over 160 students in-person and virtually, the event underscored the importance of healthcare professionals working collaboratively in interdisciplinary teams to improve patient care.

Case-based learning constitutes a crucial component of health education, equipping students with essential clinical skills, fostering active learning, and enriching their understanding of the complexities involved in patient care. Interprofessional education amplifies this by teaching students how to operate effectively within intricate healthcare settings. Combining these methodologies provides a comprehensive educational journey, preparing students to emerge as proficient, collaborative, and patient-focused professionals.

Case Study

Students from various programs within the School of Health Professions, including Diagnostic Medical Imaging, Health Informatics, Midwifery, Occupational Therapy, Physical Therapy, and Physician Assistant programs, alongside College of Nursing and School of Public Health students, actively contributed insights from their respective disciplines. They collaborated on treatment plans and discussed strategies to address complex health issues, advocate for patients, and identify supportive resources. Faculty from Health Professions, Nursing, and Public Health guided students through this immersive experience.



  • Share full article

Advertisement

Supported by

Taxpayers Were Overcharged for Patient Meds. Then Came the Lawyers.

A group of politically connected lawyers teamed up to go after insurers and made millions from one of the largest Medicaid settlements in history.

An office tower with the words “Centene Plaza” on the exterior.

By Shalina Chatlani

Shalina Chatlani examined the health care system in Mississippi as part of The Times’s Local Investigations Fellowship.

In 2018, when Mike DeWine was Ohio’s attorney general, he began investigating an obscure corner of the health care industry.

He believed that insurers were inflating prescription drug prices through management companies that operated as middlemen in the drug supply chain. There were concerns that these companies, known as pharmacy benefit managers, or P.B.M.s, were fleecing agencies like Medicaid, the government-run health insurance program for the poor.

Three years later, after Mr. DeWine became governor of Ohio, the state announced an $88 million settlement with one of the nation’s largest insurance companies, Centene.

The case led to a nationwide reckoning for the company, as attorneys general in one state after another followed Ohio’s lead, announcing multimillion-dollar settlements and claiming credit for forcing Centene to reform its billing practices.

On the surface, it appeared that these settlements, which now total nearly $1 billion, were driven by state governments cracking down on a company that had ripped off taxpayers.

But a New York Times investigation, drawing on thousands of pages of court documents, emails and other public records in multiple states, reveals that the case against Centene was conceived and executed by a group of powerful private lawyers who used their political connections to go after millions of dollars in contingency fees.

The lawyers were first hired in Ohio, without competitive bidding. Then, they gathered evidence against Centene of questionable billing practices across the country.

Using information they acquired from Centene and other sources, they negotiated with the company to set the basic framework of an agreement that could be applied in other states. With that in hand, they approached attorneys general in multiple states and made a compelling offer: hire them, at no direct cost to taxpayers, and recoup millions of dollars Centene had already set aside.

So far, the lawyers have been awarded at least $108 million in fees.

The Centene case is just one example in a thriving industry that allows private lawyers to partner with elected attorneys general and temporarily gain powers usually reserved for the government. Under the banner of their state partners, these lawyers sue corporations and help set public policy while collecting millions of dollars in fees, usually based on a percentage of whatever money they recoup. The practice has become standard fare in the oversight of major industries, shifting the work of accountability away from legislators and regulators to the opaque world of private litigation.

Private lawyers do not have to publicly defend the deals they make or prove how aggressively they went after a company accused of wrongdoing. Nearly all their work happens in secret, especially if companies settle before the stage of a lawsuit when evidence is filed with the court.

The lawyers do not even have to disclose who worked on a case or who was paid, so the public may be left unable to monitor potential conflicts of interest even as the lawyers pursue litigation on behalf of the people.

The Centene case was organized by the Mississippi-based law firm Liston & Deas along with at least three other firms, several with close ties to former Gov. Haley Barbour of Mississippi, who was once considered one of the most influential Republican power brokers in the nation.

The lawyers included Paul Hurst, who served as Mr. Barbour’s chief of staff when he was governor and who married into Mr. Barbour’s family, and David H. Nutt, one of the richest men in Mississippi, who amassed a fortune funding state lawsuits against tobacco companies. Cohen Milstein, a huge national law firm with deep experience in contingency work for state attorneys general, was also part of the venture.

Though he is not listed in any government contracts as a lawyer of record, Mr. Barbour himself was a member of the legal team when Liston & Deas vied for the contract in Ohio.

At the time, Mr. Barbour also worked for Centene as a federal lobbyist.

Even now, close to three years after Centene signed its first settlements, no one has fully explained Mr. Barbour’s role in the case for the company. There is no way for the public to know whether he influenced the outcome or to measure whether Centene paid its full share, because the data used to calculate what Centene overcharged remains hidden from the public under provisions designed to protect attorney work product.

Mr. Barbour and other lawyers said that the former governor worked on the case for less than a year when the group was examining several insurance companies, and that he cut ties when Centene emerged as the primary target. Mr. Barbour said he informed Centene and his colleagues about the development and was never involved in negotiations or legal matters. He continued representing Centene as a lobbyist, he said, but his role in the case on behalf of the company was as “more of an observer.”

The lawyers said that Mr. Barbour was never paid for his work and that the settlement was not influenced by Mr. Barbour’s connections to Centene or to the lawyers who remained. They said each state attorney general reviewed Centene’s billing practices when deciding whether to enter a settlement agreement.

Got a confidential news tip?

The New York Times would like to hear from readers who want to share messages and materials with our journalists.

In recent years, P.B.M.s have been widely criticized, including by members of Congress, who have held multiple hearings and proposed legislation. The Centene settlements stand as the most successful attempt to hold a company operating in the industry accountable.

Liston & Deas and its partner law firms uncovered that Centene had arranged discounts with CVS Caremark on certain drugs and then pocketed the savings instead of passing them on to Medicaid. In some states, they revealed that Centene layered on unnecessary management fees that it had not disclosed. Although Centene settled without admitting guilt, the company agreed to be more transparent in how it sets reimbursement rates.

The lawyers noted that they spent several years investigating Centene and negotiating with the company at their own risk, saving states the cost of building a case.

Mr. Nutt, one of the lawyers who pursued the case, said states were happy with the terms of the settlements.

“Almost every one of those states audited to determine if our damage model was fair,” Mr. Nutt said.

“The formula was based on a triple damages model that we developed. And everybody was quite satisfied with it, because it was three times what anybody could have proven in court.”

Hiring Outside Counsel

For most of their history, state attorneys general were largely focused on advising state officials on legal matters and representing local agencies in court.

That changed drastically almost 30 years ago, when states came together to sue tobacco companies and won a $206 billion settlement to cover the cost of medical care related to smoking. The lawsuit helped redefine the role of the attorney general as one of the most powerful positions in state government and a natural place to start a political career.

Through high-profile lawsuits against corporations, an attorney general could directly affect policy and build a reputation as a champion of the people.

But complex litigation against large companies can require years of investigation and legal work, with no guarantee of success. Increasingly, states have turned to private lawyers willing to work on contingency as a way to stretch limited resources.

The rise of contingency fee cases kicked off a new wave of lobbying across the nation. Law firms looking for contracts have poured money into attorney general election campaigns and sponsored conferences at high-priced resorts, where private lawyers mingle with attorneys general and pitch their latest ideas for lawsuits.

Many states have capped how much lawyers can be paid in contingency fees and have increased oversight of private firms working for the government. But there remains concern about undue political influence and potential conflicts of interest.

“In theory, there’s an incentive to have the settlement be as big as possible, and of course that’s great for the state,” said Paul Nolette, a professor at Marquette University who has studied how the role of attorneys general has changed over time.

But in reality, lawyers have an incentive to recover the largest amount of money in the shortest amount of time, which could pressure them to water down settlements and compromise on punitive measures, Dr. Nolette said.

“I think that does raise some questions about how forcefully A.G.s and private attorneys are prosecuting a particular case,” he said.

Several experts said that contingency cases had recouped billions of dollars on behalf of the public and had become a critical way to regulate the behavior of powerful industries and large corporations.

But inviting private lawyers to help set public policy has inherent risks, they said.

Private lawyers may be more likely to have conflicts of interest because they generally represent many businesses and individuals, not just the citizens of a state.

And unlike most attorneys general, private lawyers are not elected officials. They are not generally governed by open records laws or subject to public pressure, as from legislators setting their budgets.

In the Centene case, Mr. Barbour’s associations with both Centene and the private lawyers raise “important questions” about who controlled the case to make sure it was pursued in the best interests of states that settled, said Kathleen Clark, a professor of legal ethics at Washington University in St. Louis.

“Did state A.G.s proactively pursue these cases, or did they passively accept the ‘free money’ or ‘easy money’ of the proposed settlements that the law firms had already negotiated with Centene?” Ms. Clark asked.

Christina Saler, a partner at Cohen Milstein, said Mr. Barbour’s early association with the legal team was not a conflict of interest because Mr. Barbour withdrew from the case before lawyers started investigating Centene.

“After Mr. Barbour’s disassociation, we had no further contact with Mr. Barbour on this matter,” she said.

A Well-Connected Team

Mr. Barbour’s involvement in the Ohio case against P.B.M.s illustrates the potential for favoritism when states hire private lawyers.

Mr. Hurst noted the involvement of Mr. Barbour when seeking the contract in Ohio, according to emails acquired from the Ohio attorney general’s office through a public records request.

In a June 22, 2018, email exchange, just a few days before the state hired Liston & Deas, Mr. Hurst recalled meeting with the attorney general’s staff in Ohio.

Mr. Hurst went on to note that members of his team had worked with Governor Barbour while he was in office and that they all “continue to work together now.”

In an email a week later, an assistant attorney general shared Mr. Barbour’s cell number with Mr. DeWine, saying that Mr. Barbour had shared it so he could “call him about this case anytime.”

Mr. Barbour, who had served two terms as governor of Mississippi, was a former chairman of the Republican Governors Association and a former chairman of the Republican National Committee. Known as a prolific fund-raiser, he was credited with bringing in hundreds of millions of dollars to support Republican candidates across the nation.

In 1991, Mr. Barbour co-founded BGR Group, a lobbying firm that quickly became one of the most influential in Washington.

Mr. Barbour had known Mr. DeWine since he was first elected to the Senate in 1995.

Two decades later, when Mr. DeWine was in the midst of a hard-fought campaign for governor, Mr. Barbour’s close associates solicited him for the legal work on the Centene case. In October 2018, less than three months after Mr. DeWine hired Liston & Deas, he traveled to Washington to visit Mr. Barbour’s lobbying firm for several hours, according to calendar records.

At the time, Mr. Barbour and others at BGR were registered lobbyists for Centene.

Mr. Barbour has never been named in state contracts as one of the private lawyers on the case in Ohio or anywhere else. His involvement has rarely, if ever, been publicly reported.

Ms. Saler, of Cohen Milstein, said there was no need to inform state officials because Mr. Barbour had not been involved in the Centene portion of the case and had exited the venture several years before states hired the lawyers.

At least four law firms were involved in the case in two or more states, according to retainer agreements and financial records showing broadly how settlement funds were disbursed.

According to Max Littman, a former data analyst with HealthPlan Data Solutions, the analytics firm that helped identify Centene’s overcharges in Ohio, one important role for many of the lawyers was to use their connections as they presented the overcharges to various states.

Mr. Littman, who said he worked closely with the legal team, described the dynamic: Liston & Deas, with roots in a deeply red state, would approach Republican attorneys general, and Cohen Milstein, “who were our Democrats,” would focus on Democratic states.

When The Times asked for records showing Liston & Deas’s qualifications to be hired to represent the State of Ohio, the attorney general’s office said no records existed. Cohen Milstein and other law firms had submitted such documentation in the past when seeking contracts in Ohio.

Settling With States

In June 2021, nearly three years after Ohio hired its outside counsel, two states announced the first settlements with Centene on the same day: Ohio would get $88 million, Mississippi $55 million.

After that, Centene settled in one state after another, often with just months between announcements.

In fact, Centene had already set aside $1.1 billion to handle all subsequent cases. The company estimated the amount after early discussions with the private lawyers that did not involve the state attorneys general who would later work with them.

With a settlement in hand and an estimate of how much each state could collect, the private lawyers had a powerful pitch. The team also had the option to file whistle-blower lawsuits, which can advance without a state attorney general’s having to hire outside counsel.

The team pursued whistle-blower lawsuits in Texas, California and Washington.

In Texas, the whistle-blower lawsuit came with a benefit for Attorney General Ken Paxton: Under Texas law, his office is allowed to recoup “reasonable attorney’s fees” for work associated with such cases. It collected nearly $25 million in legal fees on the Centene case while spending just 561 hours on it, financial records show. That comes out to more than $44,000 per hour of work. The Texas attorney general’s office declined to comment.

Ms. Saler said all the state attorneys general decided their own strategies in reaching settlements with Centene based on the best interest of taxpayers in their states.

In states that hired the lawyers on contingency, the attorney general closely reviewed Centene’s billing practices. But no state has revealed whether its own overcharge calculations matched those of the private lawyers.

State officials who hired Liston & Deas and the other firms knew that the lawyers had previously negotiated with Centene. But in a vast majority of states, officials did not explicitly address that fact when talking publicly about the settlements.

In addition, Liston & Deas and most of the states the firm worked for have not revealed exactly how much Centene overcharged for drugs or how settlement amounts were calculated. A few states have offered sparse descriptions, which vary widely.

The New Hampshire attorney general’s office wrote in its settlement announcement that Centene’s activities had a “$2.4 million negative financial impact.” Centene agreed to pay the state nearly 10 times that amount.

The attorney general’s office in Washington, one of the few states where officials agreed to discuss basic details about the settlement with The Times, said the $33 million it recovered amounted to treble damages.

A news release from the California attorney general’s office said the state recovered double its damages, for a total settlement of more than $215 million.

As of last month, Centene had settled in at least 19 states. The Liston & Deas website says Centene will ultimately pay about $1.25 billion to 22 states.

A Sweetheart Deal?

Some observers believe Centene would have faced stricter penalties if the federal government had taken up the case instead of private lawyers hopscotching from one state to the next.

Several experts in health care fraud litigation and whistle-blower cases said the best way to recoup money for taxpayers would have been to file a federal whistle-blower case, similar to what the lawyers did in state court in Texas and California.

A federal case could have triggered the involvement of the Justice Department, which might have investigated Centene more thoroughly. And a federal case probably would have gotten more attention and media coverage, required more transparency and taken longer to complete, the experts said.

Mr. Hurst and other lawyers in the case said they had not filed any type of federal action against Centene.

A spokesperson for the Justice Department confirmed that it had inquired about the P.B.M. and Centene cases in Ohio, but no further federal action was taken. The department declined further comment.

Mary Inman, a lawyer at Whistleblower Partners L.L.P. with decades of experience, said one of the reasons Liston & Deas wound up in state court might have been that its case relied on whistle-blowers the federal government was unlikely to approve.

The whistle-blower in Texas was Mr. Hurst. In California, the whistle-blower was Matthew McDonald, a lawyer at David Nutt & Associates and the son of Bryan McDonald, who worked in Mr. Barbour’s administration when he was governor.

Ms. Inman said whistle-blowers are typically insiders with firsthand knowledge of wrongdoing who share information at some risk to themselves, not lawyers who gain information while on the job.

“It’s very unusual,” Ms. Inman said. “And it’s something that I, as a longtime lawyer in this space, I would not want to do because atmospherically and reputationally it doesn’t look great.”

Mr. Barbour said he believes everyone walked away from the settlements happy — including executives at Centene. As evidence, he cited the company’s stock performance.

“I can’t speak for them, but if I had agreed to pay a big settlement and my stock went up after the first day, I would think it was a pretty good settlement,” Mr. Barbour said.

Business Process Management (BPM) is a systematic approach to managing and streamlining business processes. BPM is intended to help improve the efficiency of existing processes, with the goal of increasing productivity and overall business performance.

BPM is often confused with other seemingly similar initiatives. For example, BPM is smaller in scale than business process reengineering (BPR), which radically overhauls or replaces processes. Conversely, it has a larger scope than task management, which deals with individual tasks, and project management, which handles one-time initiatives. And while enterprise resource planning (ERP) integrates and manages all aspects of a business, BPM focuses on its individual functions—optimizing the organization’s existing, repeatable processes end-to-end.

An effective BPM project employs structured processes, uses appropriate technologies and fosters collaboration among team members. It enables organizations to streamline project workflows, enhance productivity and consistently deliver value to stakeholders. Ultimately, the successful implementation of BPM tools can lead to increased customer satisfaction, competitive advantage and improved business outcomes.

3 main types of business process management

Integration-centric BPM focuses on processes that don’t require much human involvement. These processes connect different systems and software to streamline workflows and improve data flow across the organization, for example between human resource management (HRM) and customer relationship management (CRM) systems.

Human-centric BPM centers on human involvement, often where an approval step is required. It prioritizes intuitive process design, with drag-and-drop features that are easy for people to use and understand, aiming to enhance productivity and collaboration among employees.

Document-centric BPM is for efficiently managing documents and content—such as contracts—within processes. A purchasing agreement between a client and vendor, for example, needs to evolve and go through different rounds of approval and be organized, accessible and compliant with regulations.
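To make the distinction concrete, the sketch below models the three styles as different workflow step types in plain Python. The class names, fields, and the example workflow are illustrative assumptions, not part of any BPM product's API.

```python
# Minimal sketch: the three BPM styles expressed as workflow step types.
# All names here are hypothetical, for illustration only.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class IntegrationStep:
    """System-to-system step (integration-centric): no human involvement."""
    name: str
    call: Callable[[dict], dict]          # e.g. push data from an HRM to a CRM system

@dataclass
class ApprovalStep:
    """Human-centric step: waits for a named role to approve."""
    name: str
    approver_role: str                    # e.g. "claims_manager"

@dataclass
class DocumentStep:
    """Document-centric step: moves a document through a review round."""
    name: str
    document_id: str
    required_signoffs: list[str] = field(default_factory=list)

# A single workflow typically mixes all three step types.
purchase_agreement_flow = [
    IntegrationStep("sync_vendor_record", call=lambda ctx: {**ctx, "vendor_synced": True}),
    DocumentStep("legal_review", document_id="contract-001", required_signoffs=["legal", "procurement"]),
    ApprovalStep("final_signoff", approver_role="vp_operations"),
]
```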

Business process management examples

BPM can help improve overall business operations by optimizing various business processes. Here are some BPM examples that outline the use cases and benefits of BPM methodology:

Business strategy

BPM serves as a strategic tool for aligning business processes with organizational goals and objectives. By connecting workflow management, centralizing data management, and fostering collaboration and communication, BPM enables organizations to remain competitive by providing access to accurate and timely data. This ensures that strategic decisions are based on reliable insights.

Through BPM, disparate data sources—including spend data, internal performance metrics and external market research—can be connected. This can uncover internal process improvements, strategic partnership opportunities and potential cost-saving initiatives. BPM also provides the foundation for making refinements and enhancements that lead to continuous improvement.

  • Enhanced decision-making
  • Efficient optimization
  • Continuous improvement

Claims management

BPM can be used to standardize and optimize the claims process from start to finish. BPM software can automate repetitive tasks such as claim intake, validation, assessment, and payment processing—using technology such as Robotic Process Automation (RPA). By establishing standardized workflows and decision rules, BPM streamlines the claims process by reducing processing times and minimizing errors. BPM can also provide real-time visibility into claim status and performance metrics. This enables proactive decision-making, ensures consistency and improves operational efficiency.

  • Automated claim processing
  • Reduced processing times
  • Enhanced visibility
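As a rough illustration of the standardized decision rules described above, the following sketch routes an incoming claim to automatic payment or manual review. The validation rules, field names, and the $1,000 threshold are assumptions made up for the example.

```python
# Hedged sketch of BPM-style claims automation: simple validation rules plus a
# decision rule that routes each claim to auto-payment or human review.

VALIDATION_RULES = [
    lambda c: c.get("member_id") is not None,          # claim must identify a member
    lambda c: c.get("amount", 0) > 0,                  # claim amount must be positive
    lambda c: c.get("procedure_code", "").startswith("P"),  # hypothetical code format
]

def process_claim(claim: dict) -> str:
    """Return the next workflow state for an incoming claim."""
    if not all(rule(claim) for rule in VALIDATION_RULES):
        return "rejected_invalid"
    # Decision rule: small, unflagged claims are paid automatically;
    # everything else escalates to a human-centric review step.
    if claim["amount"] <= 1_000 and not claim.get("flags"):
        return "auto_pay"
    return "manual_review"

print(process_claim({"member_id": "M123", "amount": 250, "procedure_code": "P100"}))    # auto_pay
print(process_claim({"member_id": "M123", "amount": 9_500, "procedure_code": "P100"}))  # manual_review
```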

Compliance and risk management

By automating routine tasks and implementing predefined rules, BPM enables timely compliance with regulatory requirements and internal policies. Processes such as compliance checks, risk evaluations and audit trails can be automated by using business process management software, and organizations can establish standardized workflows for identifying, assessing, and mitigating compliance risks. Also, BPM provides real-time insights into compliance metrics and risk exposure, enabling proactive risk management and regulatory reporting.

  • Automated compliance checks
  • Real-time insights into risk exposure
  • Enhanced regulatory compliance
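A minimal sketch of what an automated compliance check with an audit trail might look like is shown below; the rule names, the $10,000 limit, and the placeholder deny-list are hypothetical assumptions, not taken from any regulation or product.

```python
# Illustrative sketch: run predefined compliance rules over a transaction and
# record an audit-trail entry for each rule, passed or failed.
from datetime import datetime, timezone

def check_transaction(txn: dict) -> list[dict]:
    """Evaluate predefined rules and return audit-trail entries."""
    rules = {
        "kyc_complete": txn.get("kyc_verified") is True,
        "within_limit": txn.get("amount", 0) <= 10_000,        # assumed threshold
        "approved_country": txn.get("country") not in {"XX"},  # placeholder deny-list
    }
    return [
        {"rule": name, "passed": passed, "checked_at": datetime.now(timezone.utc).isoformat()}
        for name, passed in rules.items()
    ]

audit = check_transaction({"kyc_verified": True, "amount": 12_000, "country": "US"})
print([entry for entry in audit if not entry["passed"]])  # the limit breach is flagged for review
```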

Contract management

Contract turnaround times can be accelerated, and administrative work can be reduced by automating tasks such as document routing, approval workflows and compliance checks. Processes such as contract drafting, negotiation, approval, and execution can also be digitized and automated. Standardized workflows can be created that guide contracts through each stage of the lifecycle. This ensures consistency and reduces inefficiency. Real-time visibility into contract status improves overall contract management.

  • Accelerated contract turnaround times
  • Real-time visibility into contract status
  • Strengthened business relationships

Customer service

BPM transforms customer service operations by automating service request handling, tracking customer interactions, and facilitating resolution workflows. Through BPM, organizations can streamline customer support processes across multiple channels, including phone, email, chat, and social media. With BPM, routine tasks such as ticket routing and escalation are automated. Notifications can be generated to update customers about the status of their requests. This reduces response times and improves customer experience by making service more consistent. BPM also provides agents with access to a centralized knowledge base and customer history, enabling them to resolve inquiries more efficiently and effectively.

  • Streamlined service request handling
  • Centralized knowledge base access
  • Enhanced customer satisfaction and loyalty
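The snippet below sketches the kind of rule-based ticket routing and escalation described above. Queue names, channels, and priority levels are assumptions for illustration only.

```python
# Minimal sketch of rule-based ticket routing across channels.
from dataclasses import dataclass

@dataclass
class Ticket:
    channel: str        # "phone", "email", "chat", or "social"
    subject: str
    priority: int       # 1 = highest

def route(ticket: Ticket) -> str:
    """Pick a work queue using simple, standardized decision rules."""
    if ticket.priority == 1:
        return "escalation_queue"     # e.g. notify an on-call agent immediately
    if ticket.channel in ("chat", "social"):
        return "realtime_queue"       # channels where customers expect fast answers
    if "billing" in ticket.subject.lower():
        return "billing_queue"
    return "general_queue"

print(route(Ticket(channel="chat", subject="Password reset", priority=3)))    # realtime_queue
print(route(Ticket(channel="email", subject="Billing dispute", priority=2)))  # billing_queue
```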

Financial management

BPM is used to streamline financial processes such as budgeting, forecasting, expense management, and financial reporting. It ensures consistency and accuracy in financial processes by establishing standardized workflows and decision rules, reducing the risk of human errors and improving regulatory compliance. BPM uses workflow automation to automate repetitive tasks such as data entry, reconciliation and report generation. Real-time visibility into financial data enables organizations to respond quickly to changing market conditions.

  • Increased operational efficiency
  • Instant insights for informed decision-making
  • Enhanced compliance with regulations and policies
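As one concrete example of automating a repetitive financial task, the sketch below matches ledger entries to bank records by reference and amount and surfaces exceptions for manual follow-up. The record shapes and the matching rule are assumed for the example.

```python
# Sketch of an automated reconciliation step with exception handling.

ledger = [{"ref": "INV-1", "amount": 100.0}, {"ref": "INV-2", "amount": 250.0}]
bank   = [{"ref": "INV-1", "amount": 100.0}, {"ref": "INV-3", "amount": 80.0}]

def reconcile(ledger_rows, bank_rows):
    """Match ledger rows against bank rows on (reference, amount)."""
    bank_index = {(r["ref"], round(r["amount"], 2)) for r in bank_rows}
    matched    = [r for r in ledger_rows if (r["ref"], round(r["amount"], 2)) in bank_index]
    unmatched  = [r for r in ledger_rows if (r["ref"], round(r["amount"], 2)) not in bank_index]
    return matched, unmatched

matched, exceptions = reconcile(ledger, bank)
print(f"matched={len(matched)}, exceptions={exceptions}")  # INV-2 needs manual review
```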

Human resources

Using BPM, organizations can implement standardized HR workflows that guide employees through each stage of their employment experience, from recruitment to retirement. The new employee onboarding process and performance evaluations can be digitized, which reduces administrative work and allows team members to focus on strategic initiatives such as talent development and workforce planning. Real-time tracking of HR metrics provides insights into employee engagement, retention rates, and the use and effectiveness of training.

  • Reduced administrative work
  • Real-time tracking of HR metrics
  • Enhanced employee experience

Logistics management

BPM optimizes logistics management by automating processes such as inventory management, order fulfillment, and shipment tracking, including those within the supply chain. Workflows can be established that govern the movement of goods from supplier to customer. Automating specific tasks such as order processing, picking, packing and shipping reduces cycle times and improves order accuracy. BPM can also provide real-time data for inventory levels and shipment status, which enables proactive decision-making and exception management.

  • Streamlined order processing and fulfillment
  • Real-time visibility into inventory and shipments
  • Enhanced customer satisfaction and cost savings

Order management

BPM streamlines processes such as order processing, tracking, and fulfillment. BPM facilitates business process automation —the automation of routine tasks such as order entry, inventory management, and shipping, reducing processing times and improving order accuracy. By establishing standardized workflows and rules, BPM ensures consistency and efficiency throughout the order lifecycle. Increased visibility of order status and inventory levels enables proactive decision-making and exception management.

  • Automated order processing
  • Real-time visibility into order status
  • Improved customer satisfaction
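One common way a workflow engine enforces consistency across the order lifecycle is an explicit state machine. The sketch below is a generic illustration under assumed states and transitions, not a specific product's order model.

```python
# Sketch: the order lifecycle as an explicit state machine, so every order's
# current state is always visible and only valid transitions are allowed.

TRANSITIONS = {
    "received":  {"validated", "rejected"},
    "validated": {"picked"},
    "picked":    {"packed"},
    "packed":    {"shipped"},
    "shipped":   {"delivered"},
}

def advance(order: dict, new_state: str) -> dict:
    """Move an order to new_state only if the workflow allows it."""
    allowed = TRANSITIONS.get(order["state"], set())
    if new_state not in allowed:
        raise ValueError(f"{order['state']} -> {new_state} is not a valid transition")
    return {**order, "state": new_state}

order = {"id": "SO-42", "state": "received"}
order = advance(order, "validated")
order = advance(order, "picked")
print(order)  # real-time visibility: the current state is always explicit
```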

Procurement management

BPM revolutionizes procurement management through the digital transformation and automation of processes such as vendor selection, purchase requisition, contract management, and pricing negotiations. Workflows can be established that govern each stage of the procurement lifecycle, from sourcing to payment. By automating tasks such as supplier qualification, RFx management, and purchase order processing, BPM reduces cycle times and improves efficiency. Also, with real-time metrics such as spend analysis, supplier performance, and contract compliance, BPM enables business process improvement by providing insights into areas suitable for optimization.

  • Standardized procurement workflows
  • Real-time insights into procurement metrics
  • Cost savings and improved supplier relationships

Product lifecycle management

BPM revolutionizes product lifecycle management by digitizing and automating processes such as product design, development, launch, and maintenance. Workflows that govern each stage of the product lifecycle, from ideation to retirement, can be standardized. Requirements gathering, design reviews, and change management can be automated. This accelerates time-to-market and reduces development costs. BPM can also encourage cross-functional collaboration among product development teams, which ensures alignment and transparency throughout the process.

  • Accelerated time-to-market
  • Reduced development costs
  • Enhanced cross-functional collaboration

Project management

Earlier, we noted that BPM is larger in scale than project management. In fact, BPM can be used to improve the project management process itself. Business process management tools can assign tasks, track progress, identify bottlenecks and allocate resources. Business process modeling helps visualize and design new workflows that guide projects through each stage of the BPM lifecycle, ensuring consistency and alignment with project objectives. Task assignments, scheduling, and progress monitoring can be automated, which reduces administrative burden and improves efficiency. Resource utilization and project performance can also be monitored in real time to make sure resources are being used efficiently and effectively.

  • Streamlined project workflows
  • Real-time insights into project performance
  • Enhanced stakeholder satisfaction
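To show how bottleneck identification can work in practice, the sketch below computes the average duration per workflow stage from task timing data and flags the slowest stage. The stage names and durations are invented for the example.

```python
# Sketch: find the workflow stage that dominates cycle time from task timings.
from collections import defaultdict
from statistics import mean

task_log = [
    {"stage": "design_review", "hours": 6},
    {"stage": "design_review", "hours": 9},
    {"stage": "build",         "hours": 4},
    {"stage": "approval",      "hours": 20},
    {"stage": "approval",      "hours": 16},
]

by_stage = defaultdict(list)
for task in task_log:
    by_stage[task["stage"]].append(task["hours"])

averages = {stage: mean(hours) for stage, hours in by_stage.items()}
bottleneck = max(averages, key=averages.get)
print(averages)
print(f"bottleneck: {bottleneck}")  # 'approval' dominates cycle time in this sample
```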

Quality assurance management

BPM facilitates the automation of processes such as quality control, testing, and defect tracking, while also providing insights into KPIs such as defect rates and customer satisfaction scores. Quality assurance (QA) process steps are guided by using standardized workflows to ensure consistency and compliance with quality standards. Metrics and process performance can be tracked in real time to enable proactive quality management. Process-mapping tools can also help identify inefficiencies, thereby fostering continuous improvement and QA process optimization.

  • Automated quality control processes
  • Real-time visibility into quality metrics

Business process management examples: Case studies

Improving procure-to-pay in state government

In 2020, one of America’s largest state governments found itself in search of a new process analysis solution. The state had integrated a second management system into its procurement process, which required the two systems, SAP SRM and SAP ECC, to exchange data in real time. With no way to analyze the collected data, the state couldn’t monitor the impact of its newly integrated SAP SRM system, nor detect deviations during the procurement process. This created an expensive problem.

The state used IBM Process Mining to map out its current workflow and track the progress of the SAP SRM system integration. Using the software’s discovery tool, data from both management systems was optimized to create a single, comprehensive process model. With the end-to-end process mapped out, the state was able to monitor all its process activities and review the performance of specific agencies.
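For intuition only, the snippet below sketches the core idea behind process discovery: deriving directly-follows relationships between activities from an event log. It is not the IBM Process Mining API; the case IDs and activity names are made up, and real tools build far richer models than this.

```python
# Conceptual sketch of process discovery: count how often activity A is
# immediately followed by activity B within each case of an event log.
from collections import Counter

event_log = [
    # (case_id, activity), ordered by timestamp within each case
    ("PO-1", "create_requisition"), ("PO-1", "approve"), ("PO-1", "create_po"), ("PO-1", "pay"),
    ("PO-2", "create_requisition"), ("PO-2", "create_po"), ("PO-2", "approve"), ("PO-2", "pay"),
]

def directly_follows(log):
    """Return counts of directly-follows pairs across all cases."""
    by_case = {}
    for case_id, activity in log:
        by_case.setdefault(case_id, []).append(activity)
    edges = Counter()
    for trace in by_case.values():
        for a, b in zip(trace, trace[1:]):
            edges[(a, b)] += 1
    return edges

for (a, b), n in directly_follows(event_log).items():
    print(f"{a} -> {b}: {n}")  # deviations show up as unexpected edges, e.g. a PO created before approval
```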

Streamlining HR at Anheuser-Busch

AB InBev wanted to streamline its complicated HR landscape by implementing a single global solution to support employees and improve their experience, and it selected Workday as its human capital management (HCM) software. Working with a team from IBM® Workday consulting services, part of IBM Consulting™, AB InBev remediated the integration between its legacy HR apps and the HCM software.

What was once a multi-system tool with unorganized data has become a single source of truth, enabling AB InBev to run analytics for initiatives like examining employee turnover at a local scale. Workday provides AB InBev with a streamlined path for managing and analyzing data, ultimately helping the company improve HR processes and reach business goals.

Business process management and IBM

Effective business process management (BPM) is crucial for organizations to achieve more streamlined operations and enhance efficiency. By optimizing processes, businesses can drive growth, stay competitive and realize sustainable success.

IBM Consulting offers a range of solutions to make your process transformation journey predictable and rewarding.

  • Our traditional AI and generative AI-enabled Process Excellence practice uses the leading process mining tools across the IBM ecosystem and partners.
  • Our patented IBM PEX Value Triangle includes industry standards, benchmarks, and KPIs and is used to quickly identify process performance issues and assess where and how our clients can optimize and automate everywhere possible.
  • The IBM Automation Quotient Framework and Digital Center of Excellence (COE) platform prioritizes and speeds up automation opportunities, ultimately establishing a Process Excellence COE for continuous value orchestration and governance across your organization.

Key improvements might include 60-70% faster procurement, faster loan booking, and a reduced finance rework rate, along with risk avoidance and increased customer and employee satisfaction.

With principles grounded in open innovation, collaboration and trust, IBM Consulting doesn’t just advise clients. We work side by side to design, build, and operate high-performing businesses—together with our clients and partners.

