Medical Informatics Book
The Computer Arrives in Medicine
After scientists developed the first computer in the 1940s, society was told that these
new machines would soon routinely serve as memory devices, helping with
calculations and information retrieval. Within the next decade, physicians and other
health professionals had begun to hear about the dramatic effects such technology
would have on clinical practice. More than six decades of remarkable progress in
computing have followed those early predictions, and many of the prophecies have
come true. Stories about the “information revolution” and “big data” fill our
newspapers and popular magazines, and today's children display an uncanny facility with computers (including, increasingly, their mobile versions), routinely using them as tools
for study and entertainment. Similarly, clinical workstations have been available in
hospital wards and outpatient clinics for years, and are gradually being supplanted by
mobile devices with wireless connectivity. Many observers nonetheless cite the health care system as being slow to understand information technology, slow to exploit it for its unique practical and strategic capabilities, slow to incorporate it effectively into the work environment, and slow to recognize its strategic importance and the consequent need for investment and commitment. Nevertheless, the enormous technological
advances of the last three decades—personal computers and graphical interfaces,
new methods for human-computer interaction, innovations
in massive data storage (both locally and in the “cloud”), mobile devices, personal
health monitoring devices and tools, the Internet, communications, social media, and more—have all combined to make routine use of computers by all health care workers
and biomedical scientists inevitable. A new world is already with us, but its greatest influence is yet to come. This book describes our current resources and achievements and what to expect in the years ahead.
When one considers the penetration of computers and communication into our daily
lives today, it is remarkable that the first personal computers were introduced as recently as the late 1970s; the local area network has been available only since
~1980; the World Wide Web dates only to the early 1990s; and smartphones, social
networks, and wireless communication are even more recent. This dizzying pace of change, combined with equally widespread and revolutionary changes in almost all international health care systems, makes it difficult for public health planners and health facility managers to deal with both sets of changes at the same time. Yet many observers now
believe that the two issues are inextricably linked and that planning for new health
care in the coming decades requires a deep understanding of the role that
information technology is likely to play in those environments. What might that future
hold for the typical practicing physician? As we discuss in detail in Chap. 12, no topic in applied clinical computing is currently receiving more attention than electronic health records (EHRs). Healthcare organizations have recognized that they
do not have systems in place to effectively answer questions that are critical to
strategic planning, to better understand how they compare to other provider groups in
their local or regional competitive environment, and to report to regulatory agencies.
In the past, administrative and financial data were the main elements required for
such planning, but comprehensive clinical data are now also important for institutional
self-analysis and strategic planning. Furthermore, the inefficiencies and frustrations
associated with the use of paper medical records are now well accepted (Dick and
Steen 1991 (Revised 1997)), especially when inadequate access to clinical
information is one of the major barriers physicians encounter when attempting to
increase their efficiency in order to meet productivity goals for their practices.
Integrated information systems also support administrative and financial topics (e.g., patient tracking within the hospital, materials and inventory management, personnel functions, and payroll management), research (e.g., analyzing outcomes associated with treatments and procedures, performing quality assurance, supporting clinical trials, and implementing various treatment protocols), scholarly activities (e.g., access to digital libraries, support for bibliographic searching, and access to drug information databases), and even office automation (e.g., access to spreadsheets and document-management software). The key insight, however, is that at the heart
of evolving integrated environments lie electronic health records that are intended to
be accessible, confidential, secure, acceptable to physicians and patients, and
integrated with other types of useful information to assist in planning and problem
solving.
The use of EHRs has offered many advantages to those conducting clinical research
(see Chap. 26). Most obviously, it helps eliminate the manual task of extracting data
from charts or filling out specialized data sheets. The data needed for a study can
often be derived directly from the EHR, thus making much of what is required for
research data collection simply a byproduct of routine clinical record keeping (Fig.
1.5). Furthermore, the advantages accumulate. For example, the EHR environment can help ensure compliance with a research protocol by signaling to a
physician when a patient is eligible for a study or when a study protocol requires a
specified management plan given the current available data about that patient. We
are also seeing the development of new authoring environments for clinical trial
protocols that can help ensure that data elements required for the trial are compatible
with local EHR conventions for representing patient descriptors.
Another issue in the changing world of health care is the increasing investment in
creating standard order sets, clinical guidelines, and clinical pathways (see Chap.
22), generally in an effort to reduce practice variability and develop consensus
approaches to managing recurring problems. Several governments and professional organizations, as well as individual provider groups, have invested heavily in
developing guidelines, often placing emphasis on using evidence from the literature,
rather than expert opinion alone, as a basis for advice. Despite the success in
creating such evidence-based guidelines, there is a growing recognition that we need
better methods for bringing decision logic to the point of care. Guidelines that appear
in monographs or journal articles tend to sit on shelves, unavailable when the
knowledge they contain would be most valuable to practitioners. Informatics tools to
implement such guidelines, and integrate with the EHR, present a means to make
high-quality advice available in the routine clinical setting.
Efforts that continue to advance the state of the art in Internet technologies all have
significant implications for the future of health care delivery in general and EHRs and
their integration in particular (Shortliffe 1998b, 2000). But beyond increasing the
speed, reliability, security and availability of the Internet, there are many other areas
that need attention if the vision of a learning healthcare system is to be achieved.
There is a difference between computer literacy (familiarity with computers and their
routine uses in our society) and knowledge of the role that computing and
communications technology can and should play in our health care system. We are
generally doing a poor job of training future physicians in this latter area and are
therefore leaving them ill-equipped for the challenges and opportunities they will face
in the rapidly changing practice environments that surround them (Shortliffe 2010).
Moreover, much of the vision for the future proposed here can be achieved
only if educational institutions produce a cadre of talented individuals who not only
understand computing and communications technology, but also have a deep
understanding of the biomedical environment and the needs of physicians and other
health care workers. Computer science training alone is not adequate. Fortunately,
we have begun to see the creation of formal training programs in what has become
known as biomedical informatics (see Sect. 1.4 ) that provide personalized
educational opportunities. Many of the trainees are research scientists, physicians, nurses, pharmacists, and other health care professionals who see
professional opportunities and challenges at the intersections of biomedicine,
information science, computer science, decision science, cognitive science, and
communication technology. As has been clear for more than two decades (Greenes
and Shortliffe 1990), however, demand for such individuals far outstrips supply, both
for academic and industrial career paths. We need more training programs, expansion of those that already exist, and support for young faculty in health science schools who wish to pursue further training in this area.
During the first decades of the evolution of clinical information systems for use in hospitals, patient care, and public health, the primary role of government was to support
the research enterprise as new methods were developed, tested, and formally
evaluated. The topic was rarely mentioned by the nation's leaders, however, even
during the 1990s when the White House was considered especially tech-savvy. It
was therefore notable when, in the President's State of the Union address in 2004
(and in each of the following years of his administration), President Bush called for
universal implementation of electronic health records within 10 years. Health and
Human Services Secretary Tommy Thompson also supported this, and in May 2004
created an entity intended to support expanded EHR use: the Office of the National
Coordinator for Health Information Technology (initially referred to by the full acronym ONCHIT, but later abbreviated to simply ONC). There was a limited budget for the
ONC, although the organization served as a convening body for planning efforts
related to EHRs and the National Health Information Infrastructure (see Chaps. 12, 13,
and 27). The topic of EHRs subsequently became a talking point for both major
candidates during the 2008 presidential election, with strong bipartisan support.
However, it was the American Recovery and Reinvestment Act
(ARRA) in early 2009, also known as the economic “Stimulus Act,” which first
provided significant funding to create financial incentives for health systems, hospitals, and providers to implement EHRs in their practices. Such payments were available,
however, only when eligible organizations or individual professionals implemented
EHRs that were "certified" as meeting minimum standard requirements and when they could document that they were making "meaningful use" of those systems. You will see references to these certification and meaningful-use criteria in many chapters of this volume. There is also a discussion of HIT policy and the federal government in Chap. 27. Although the EHR implementation process is still ongoing at present, the
trend is clear: because of the federal stimulus package, a large number of hospitals,
systems, and physicians are investing in EHRs and incorporating them into their
practices. In addition, the demand for workers trained in health information
technology has grown much faster than the labor market, even within health care
(Fig. 1.10). It is a striking example of how government policy and investment can spur
important transitions in systems such as health care, where many observers had
previously felt that progress had been unacceptably slow (Shortliffe 2005).
1.4.1 Terminology
Since the 1960s, when increasing numbers of people doing serious biomedical
research or clinical practice had access to some form of computer system, people
have been unsure what name they should use for the biomedical application of
computer science concepts. The name computer science was new around 1960 and was only vaguely defined. Even today, the term is used more as a matter of convention than as an explanation of the field's scientific content. In the 1970s we began using the phrase medical computer science to refer to the subdivision of computer science that applies the methods of the broader field to medical topics. As you will see, however, medicine has provided a rich area for computer science research, and several computer science insights and methodologies have been derived from applied medical computing research.
Terms such as biomedical computing have also been used; they are more neutral, implying only that computers are used for some purposes in biology or medicine. They are often associated with bioengineering applications of computers, however, in which the devices are viewed more as tools for a bioengineering application than as a primary focus of research.
In the 1970s, inspired by the French term for informatics, the English-speaking
community began using the term medical informatics. Those in the field were
attracted by the word's emphasis on information, which they saw as more central to
the field than the computer itself, and it gained momentum as a term for the
discipline, especially in Europe, during the 1980s. The term is broader than medical computing (including topics such as medical statistics, record keeping, and the study
of the nature of medical information itself) and de-emphasizes the computer while
focusing instead on the nature of the field in which the applied computations are
performed. Because the term informatics became widely accepted in the United
States only in the late 1980s, medical information science was also previously used
in North America; this term, however, can be confused with librarianship, and does
not capture the broader implications of the European term. As a result, the name
medical informatics emerged in the late 1980s and became the preferred term, even in the United States. In fact, this is the name of the field we used in the first two editions of this textbook (from 1990 to 2000), and it is still sometimes used in
professional, industrial, and academic settings. However, many observers expressed
concern that the adjective "medical" is too physician-centric and fails to appreciate
the relevance of this discipline to other health services and life sciences
professionals. Thus, the term health informatics, or health care informatics, gained
some popularity, although it has the disadvantage that it tends to exclude applications
to biomedical research (Chaps. 24 and 25) and, as we argue shortly, tends to focus the
field name on domains of application (clinical care, public health, and prevention)
rather than the basic discipline and its broad range of applicability. Applications of
computational methods in biology and genetics exploded during the 1990s due to the Human Genome Project and the growing recognition that modern life sciences research was no longer possible without computational support and analysis (see Chaps. 24 and 25). By the late 1990s, the use of computational methods in such work
had become widely known as bioinformatics, and the director of the National
Institutes of Health (NIH)
appointed an advisory group called the Working Group on Biomedical Informatics. In
June 1999, the group issued a report recommending that the NIH undertake an initiative called the Biomedical Information Science and Technology Initiative (BISTI).
With the subsequent creation of another NIH organization called the Bioinformatics
Working Group, the visibility of computer applications in biology was greatly
enhanced. Today, bioinformatics is a major area of activity at the NIH and at many universities and biotechnology companies around the world. The explosive growth of
this field, however, has added to the confusion regarding the naming conventions we
have been discussing. Furthermore, the relationship between medical informatics and
bioinformatics became unclear. As a result, in an effort to be more inclusive and
embrace the biological applications with which many medical informatics groups had
already been involved, the name medical informatics gradually gave way to biomedical informatics (BMI). Several academic groups have changed their names, and a major medical informatics journal (Computers and Biomedical Research) was reborn in 2001 as The Journal of Biomedical Informatics. Despite
this convoluted naming history, we believe that the wide range of problems in
biomedical information management requires an appropriate name, and, beginning
with the third edition of this book (2006), we use the term biomedical informatics for this purpose. It has become the most widely accepted term for the core discipline and should be viewed as broadly encompassing all areas of application in health, clinical practice, and biomedical research. When we speak specifically about computers and their use within biomedical informatics activities, we use the terms biomedical computer science (for the methodological issues) or biomedical computing (to describe the activity itself). Keep in mind, however, that biomedical informatics has many other
component sciences besides computer science. These include decision sciences,
statistics, cognitive science, information science, and even management sciences.
We return to this point shortly when we discuss the basic versus applied nature of the
field when viewed as a basic research discipline. Although labels like these are
arbitrary, they are by no means insignificant. In the case of new fields of activity or
branches of science, they are important both in designating the field and in defining
or restricting its content. The most distinctive feature of the modern computer is the
generality of its application. The almost unlimited range of computer uses
complicates
the matter of naming the field. As a result, the nature of computer science is perhaps
best illustrated by examples rather than by attempts at formal definition. Much of this
book presents examples that do exactly this for biomedical informatics as well. The
American Medical Informatics Association (AMIA), which was founded in the late
1980s under the discipline's former name, has acknowledged the confusion regarding
the field and its definition. It accordingly appointed a task force to develop a definition of the field and to specify the core competencies that should be acquired by students seeking graduate training in the discipline. The resulting definition, published in AMIA's journal and approved by the organization's board, identifies
the focus of the field in a simple sentence and then adds four clarifying corollaries
that refine the definition and the scope and content of the field (Table 1.1). We adopt
this definition, which is very similar to the one we offered in previous editions of this
text. It recognizes that the emergence of biomedical informatics as a new discipline is in large part due to rapid advances in computer and communications
technology, a growing awareness that the knowledge base of biomedicine is
essentially unmanageable by traditional paper-based methods, and a growing
conviction that the process of informed decision-making is as important to modern
biomedicine as is the collection of facts on which clinical decisions are based or
research plans are developed.
1.4.2 Historical Perspective
The modern digital computer emerged from developments in the United States and
abroad during World War II, and general-purpose computers began to appear on the
market in the mid-1950s (Fig. 1.11). Speculation about what could be done with such machines (if they should ever become reliable) had begun, however, much earlier. Scholars, at
least since the Middle Ages, had often asked the question of whether human
reasoning can be explained in terms of formal or algorithmic processes. Gottfried Wilhelm von Leibniz, a 17th-century German philosopher and mathematician, attempted to develop a calculus that could be used to simulate human reasoning. The notion of a "logic engine" was further developed by Charles Babbage in the mid-19th century. The first practical application of automatic computing relevant to medicine was Herman Hollerith's development of a punched-card data-processing system for the 1890 US census (Fig. 1.12). His methods were soon adapted
to epidemiology and public health surveys, ushering in the era of electromechanical
punched card data processing technology, which matured and was widely adopted
during the 1920s and 1930s. These techniques were the precursors of the stored-program, fully electronic digital computers, which began to appear in the late
1940s (Collen 1995). One of the first activities of biomedical computing was the
attempt to build systems that would assist a physician in decision making (see Chap.
22). Not all biomedical computing programs followed this course, however. Many of the early ones instead
investigated the notion of a comprehensive hospital information system (HIS; see
Chap. 14). These projects were perhaps less ambitious in the sense that they were
more concerned with short-term practical applications; the difficulties they
encountered, however, were
still fearsome. The earliest work on HIS in the United States was probably that associated with General Electric's MEDINET project, followed by work at Bolt, Beranek, and Newman in Cambridge, Massachusetts, and then at Massachusetts General Hospital (MGH) in Boston by Barnett and his colleagues over three decades beginning in the early 1960s. Work on similar systems was done by Warner at the Latter-day Saints (LDS) Hospital in Salt Lake City, Utah, by Collen at Kaiser Permanente in Oakland, California, by Wiederhold at Stanford University in Stanford, California, and by scientists at Lockheed in Sunnyvale, California.
The development of HIS applications branched in the 1970s. One approach was based on the concept of an integrated or monolithic design, in which a single large, shared computer would over time be used to support an entire collection of applications. An alternative was a distributed design.
The exciting achievements of biomedical informatics, and the potential for future
benefits to medicine, must be seen in the context of our society and the existing
healthcare system. As early as 1970, an eminent physician suggested that computers
might eventually have a revolutionary influence on medical care, medical education,
and even the selection criteria for health science students (Schwartz 1970).
The subsequent huge growth in computing activity has been greeted with some concern by health professionals, who ask where it will all end. Will health care workers be gradually replaced by computers? Will nurses and doctors need to be highly trained in computer science or informatics before they can practice their professions effectively? Will patients and health care workers eventually rebel rather than accept a trend toward automation that they believe may threaten the traditional values of health care delivery?
Biomedical informatics applies across a wide range of possible areas of application (Fig. 1.19). The analogy with other basic sciences is that biomedical informatics uses the results of past experience to understand, structure, and encode objective and subjective biomedical findings and thus to make them suitable for processing. This approach supports the integration of findings and their analyses. In turn, the selective distribution of newly created knowledge can aid patient care, health planning, and basic biomedical research.
Biomedical informatics is, by its nature, an experimental science, characterized by asking questions, designing experiments, performing analyses, and using the information obtained to design new experiments. One goal is simply to seek new knowledge, called basic research. A second goal is to use this knowledge for practical purposes, called applied research. There is a continuum between these two endeavors (see Fig. 1.19). In biomedical informatics, there is a particularly close coupling between the areas of application, broad categories of which are indicated at the bottom of Fig. 1.19, and the identification of the basic research tasks that characterize the scientific foundations of the field. Research,
however, has shown that there can be a very long time lag between the development
of new concepts and methods in basic research and their eventual application in
the biomedical world (Balas and Boren 2000).
Furthermore (see Fig. 1.20), many discoveries
are discarded along the way, leaving only a small
percentage of basic research discoveries that
have a practical influence on the health and care of patients.
Work in biomedical informatics (BMI) is intrinsically motivated by the problems encountered in a set of applied domains in biomedicine. The first of these has historically been clinical care (including medicine, nursing, dentistry, and veterinary care), an area of activity that demands patient-oriented informatics applications. We refer to this area as clinical informatics. It includes several subtopics and areas of specialized expertise, including patient-care foci such as nursing informatics, dental informatics, and veterinary informatics. Indeed, the discipline's former name, medical informatics, is now reserved for applied research and practice topics that focus on disease and the role of physicians. As discussed above, the term "medical informatics" is no longer used to refer to the discipline as a whole.
Closely linked to clinical informatics is public health informatics (Fig. 1.19), where similar methods are generalized for application to populations of patients rather than to single individuals (Chap. 16). Thus, clinical informatics and public health
informatics share many of the same methods and techniques. Two other major areas
of application overlap in some ways with clinical informatics and public health
informatics. These include imaging informatics (and the set of issues developed around radiology and other image-management and image-analysis domains such as pathology, dermatology, and molecular visualization; see Chaps. 9 and 20). Finally,
there is the burgeoning area of bioinformatics, which at the molecular and cellular
level is offering challenges that rely on many of the same computational methods as
well (see Chap. 24).
As shown in Fig. 1.21, there is a spectrum as one moves from left to right across
these BMI application domains. In bioinformatics, workers deal with molecular and
cellular processes in the application of computer methods. At the next level, workers
focus on tissues and organs, which tend to be the emphasis of imaging informatics
work (also called structural informatics in some institutions). Moving into clinical
informatics, the focus shifts to individual patients and eventually to public health,
where researchers address population and societal problems. The core science of
biomedical informatics has important contributions to make across that spectrum, and
many computational methods are broadly applicable across the same range of
domains.
Note in Fig. 1.19 that biomedical informatics and bioinformatics are not synonyms; it is incorrect to refer to the scientific discipline as bioinformatics, which is, rather, an important area of application of BMI methods and concepts. Similarly, the term health informatics, which refers to applied research and practice in clinical and public health informatics, is also not a proper name for the core discipline, since BMI is applicable to basic human biology as well as to health. The application areas have fuzzy boundaries, and many areas of applied informatics research involve more than one of the categories. For example, biomolecular imaging involves both bioinformatics and imaging informatics concepts. Similarly, consumer health informatics (see Chap. 17) includes elements of both clinical informatics and public health informatics.
Another important area of BMI research activity is pharmacogenomics (see Chap. 25), which is the effort to infer the genetic determinants of human response to drugs. Such work requires the analysis of linked genotype and phenotype databases, and is therefore at the intersection of bioinformatics and clinical informatics.
In general, BMI researchers draw their inspiration from one of the application areas, identifying fundamental methodological issues that need to be addressed and testing them on prototype systems or, for more mature methods, on real systems that are used in clinical applications or biomedical research settings. An important implication of this view is that the core discipline is identical regardless of the application area that a given individual is motivated to address, although some BMI methods have greater relevance to some domains than to others. This argues for unified BMI educational programs, which bring together students with a wide variety of application interests. Elective courses and internships in specific areas of interest are, of course, important to complement the core exposures that students should receive, but, given the need for teamwork and understanding in the field, separating learners based on the application areas they may be interested in would be counterproductive and wasteful.
BMI's scientific contributions can also
be appreciated for their potential to benefit the education of health professionals
(Shortliffe 2010). For example, in the education of medical students, the various cognitive activities of physicians have traditionally tended to be considered separately and in isolation; they have largely been treated as if they were independent and distinct modules of performance. An
activity that is attracting increasing interest is formal medical decision making (see
Chap. 3). The specific content of this area remains to be fully defined, but the
discipline's reliance on formal methods and its use of knowledge and information
reveal it to be one aspect of biomedical informatics. A particular topic in the study of
medical decision making is diagnosis, which is often conceived and taught as if it
were an independent activity. Medical students may be led to view diagnosis as a
process that physicians carry out in isolation before choosing therapy for a patient or
moving on to other modular tasks. A number of studies have shown that this model is
oversimplified and that such a decomposition of cognitive tasks can be quite
misleading (Elstein et al. 1978; Patel and Groen 1986). Doctors seem to be dealing
with several tasks at the same time. Although a diagnosis may be one of the first
things doctors think about when they see a new patient, patient assessment
(diagnosis, management, analyzing treatment outcomes, monitoring disease
progression, etc.) is a process that is never really over. A doctor must be flexible and
open-minded. It is usually appropriate to modify the original diagnosis if it turns out
that treatment based on it is unsuccessful or if new information weakens the evidence
supporting the diagnosis or suggests a second, concurrent disorder. These issues
are discussed in more detail in Chap. 4. When we talk about making a diagnosis,
choosing a treatment, managing a therapy, making decisions, monitoring a patient, or
preventing a disease, we are using labels for different aspects of health care, an
entity that has an overall unity. The fabric of health care is a continuum in which these elements are closely intertwined. Regardless of whether we consider computing and informatics a profession, a technology, or a science, there is no doubt about their importance to biomedicine. We can assume that computers will be used ever more extensively in clinical practice, biomedical research, and health science education.
Information
From the above discussion, one might conclude that biomedical applications pose no unique problems or concerns. On the contrary, the biomedical environment raises several issues that, interestingly, are quite distinct from those found in most other domains of applied computing. Clinical information appears to be systematically different from the information used in physics, engineering, or even clinical chemistry (which resembles general chemical applications more than medical ones). Aspects of biomedical information include an essential uncertainty: we can never know everything about a physiological process, and this gives rise to inevitable variability between individuals. These differences pose problems that are special and demanding.
The second factor is the increase in the number of professionals being trained to understand biomedical problems as well as technical and engineering ones. Computer scientists who understand biomedicine are better able to design systems that respond to real needs and are sensitive to workflow and clinical culture. Health professionals who receive formal training in BMI are likely to build systems using well-established techniques while avoiding the past mistakes of other developers. As more professionals are trained in the special aspects of both fields, and as the programs they develop are introduced, health care professionals are more likely to have useful and usable systems available when they turn to the computer for help with information management tasks.
The third factor affecting the integration of information technology into health care settings is managed care and the increasing pressure to control medical spending. The growing trend toward applying technology to all patient-care tasks is a frequently cited phenomenon in modern medical practice. Mere physical findings are no longer considered adequate for making diagnoses and planning treatment. Indeed, medical students who are taught by more experienced physicians to find subtle diagnostic signs by examining various parts of the body often choose instead to bypass or downplay physical examinations in favor of ordering one test after another, sometimes without paying sufficient attention to the resulting costs. Some new technologies replace less expensive, but technologically inferior, tests. In such cases, the use of