Many CIOs have asked me how analytics came to be so important. To answer that question, I first want to review the history of healthcare information systems.
I’ve been in the industry long enough to see many of these changes occur, and I believe that a data warehouse and analytics tools are core components of any CIO’s application portfolio.
1960s: Because computers and storage were so large and expensive, hospitals typically shared a mainframe.
The principal applications emerging in this environment were shared hospital accounting systems.
1970s: Among the main healthcare drivers in this era were the need for better communication between departments (ADT, order communications, and results review) and the need for discrete departmental systems (e.g., clinical lab, pharmacy).
Computers were now small enough to be installed in a single department without environmental controls. Unfortunately, these transactional systems, embedded in individual departments, were typically islands unto themselves.
1990s: In this decade, competition and consolidation drove healthcare, along with the need to integrate hospitals, providers, and managed care.
From an IT perspective, hospitals now had access to broad, distributed computing systems and robust networks.
All of a sudden, we found ourselves with enormous amounts of data siloed in multiple, discrete applications. Brent James at Intermountain Healthcare began to articulate to the industry that improving operational performance would require health systems to merge and analyze this data.

Another focus of hospital information system implementation over the years has been reporting. Reporting systems typically exist as components of transaction systems. Historically, this reporting provided snapshots of information about the hospital to management, the board, or other groups. Reporting systems embedded in a transaction system clearly can't merge and analyze data across these silos.