
MEDICAL INFORMATICS BOOK.

1.1 The Information Revolution Arrives in Medicine

After scientists developed the first computer in the 1940s, society was told that these
new machines would soon routinely serve as memory devices, helping with
calculations and information retrieval. Within the next decade, physicians and other
health professionals had begun to hear about the dramatic effects such technology
would have on clinical practice. More than six decades of remarkable progress in
computing have followed those early predictions, and many of the prophecies have
come true. Stories about the “information revolution” and “big data” fill our
newspapers and popular magazines, and today's children display an uncanny ability to use computers (including, increasingly, mobile versions) as routine tools
for study and entertainment. Similarly, clinical workstations have been available in
hospital wards and outpatient clinics for years, and are gradually being supplanted by
mobile devices with wireless connectivity. However, many observers cite the health
care system as being slow to understand information technology, slow to exploit it for
its unique practical and strategic capabilities, slow to incorporate it effectively into the
work environment, and slow to understand its strategic importance and consequent
need for investment and commitment. Nevertheless, the enormous technological advances of the last three decades—personal computers and graphical interfaces, new methods for human-computer interaction, innovations in massive data storage (both local and in the “cloud”), mobile devices, personal health monitoring devices and tools, the Internet, communications, social media, and more—have combined to make routine use of computers by all health care workers and biomedical scientists inevitable. A new world is already with us, but its greatest influence is yet to come. This book surveys our current
resources and achievements and what to expect in the years to come.

When one considers the penetration of computers and communication into our daily
lives today, it is remarkable that the first personal computers were introduced as recently as the late 1970s; the local area network has been available only since ~1980; the World Wide Web dates only to the early 1990s; and smartphones, social networks, and wireless communication are more recent still. This dizzying pace of change, combined with equally pervasive and revolutionary changes in almost all international health care systems, makes it difficult for public health planners and health facility managers to deal with both sets of issues at the same time. However, many observers now
believe that the two issues are inextricably linked and that planning for new health
care in the coming decades requires a deep understanding of the role that
information technology is likely to play in those environments. What might that future
hold for the typical practicing physician? As we discuss in detail in Chap. 12, no topic in applied clinical computing currently receives more attention than electronic health records (EHRs). Healthcare organizations have recognized that they
do not have systems in place to effectively answer questions that are critical to
strategic planning, to better understand how they compare to other provider groups in
their local or regional competitive environment, and to report to regulatory agencies.
In the past, administrative and financial data were the main elements required for
such planning, but comprehensive clinical data are now also important for institutional
self-analysis and strategic planning. Furthermore, the inefficiencies and frustrations
associated with the use of paper medical records are now well accepted (Dick and
Steen 1991, revised 1997), especially when inadequate access to clinical
information is one of the major barriers physicians encounter when attempting to
increase their efficiency in order to meet productivity goals for their practices.

1.1.1 Integrated Access to Clinical Information: The Future Is Now

Encouraged by health information technology (HIT)
vendors (and by the U.S. government, as discussed below), most health care
institutions are attempting to develop computer systems integrated into information
management environments. These are single entry points into a clinical world where
computational tools assist not only with patient care issues (reporting test results,
allowing direct entry of patient orders or information by physicians, facilitating access
to transcribed reports, and in some cases supporting telemedicine or decision
support applications or functions) but also with administrative and financial topics (e.g., patient tracking within the hospital, materials and inventory management, support of personnel functions, and payroll management), research (e.g., analyzing outcomes associated with treatments and procedures, performing quality assurance, supporting clinical trials, and implementing various treatment protocols), scholarly information (e.g., access to digital libraries, support for bibliographic searching, and access to drug-information databases), and even office automation (e.g., access to spreadsheets and document-management software). The key insight, however, is that at the heart of these evolving integrated environments lies an electronic health record that is intended to
be accessible, confidential, secure, acceptable to physicians and patients, and
integrated with other types of useful information to assist in planning and problem
solving.

1.1.2 Moving Beyond the Paper Record

The traditional paper-based medical record is now recognized as woefully inadequate to meet the needs of modern medicine. It emerged in the 19th
century as a "laboratory notebook" that physicians could use to record their
observations and plans so that relevant details could be recalled the next time
they saw the same patient. There were no regulatory requirements, no assumptions that the record would be used to support communication among multiple care providers, and few data or test results to populate the record's pages. The record that met the needs of physicians a century ago has struggled mightily to adapt over the decades as new requirements have emerged and as health care and medicine have changed. Today the failure of paper charts to serve the best interests of the patient, the physician, and the health care system is clear (see Chaps. 12 and 14). Most organizations have nonetheless found it challenging (and expensive) to move to a paperless electronic health record. This observation forces us to ask: “What is a health record in the modern world? Are products and systems available that are well matched to modern notions of the health record? Do they meet the needs of users, as well as of the health systems themselves?” The complexity
associated with automating clinical care records is best appreciated if one looks at
the processes associated with creating and using such records rather than thinking of
the record as a physical object that can be moved as needed within the institution.
For example, on the input side (Fig. 1.1), the EHR requires the integration of
processes for data capture and for merging information from various sources. The content of the paper record has traditionally been organized chronologically, often a severe limitation when a physician is trying to find a specific piece of information that could occur almost anywhere within the chart. To be useful, the
record system must make it easy to access and display the necessary data, to
analyze and share it among colleagues and with secondary users of the record who
are not involved in direct patient care (Fig. 1.2). Therefore, the EHR is best viewed
not as an object, or a product, but as a set of processes that an organization must put
in place, supported by technology (Fig. 1.3). Electronic record implementation is
inherently a systems integration task; a medical record system for a complex
organization cannot be purchased as an off-the-shelf product. Co-development and
local adaptation are crucial, which implies that institutions purchasing such systems
must have local expertise that can oversee and facilitate an effective implementation
process, including the elements of process re-engineering and cultural change that
are inevitably involved.
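To make the “set of processes” view concrete, the following is a minimal sketch in Python of one such process: merging observations captured by separate departmental systems into a single patient-centered record that can be searched by concept rather than only by leafing through a chronological chart. The feed structure, field names, and two-function design are purely illustrative assumptions, not a depiction of any actual EHR product.

from dataclasses import dataclass
from datetime import datetime
from typing import Iterable, List

@dataclass
class Observation:
    patient_id: str   # institution-wide patient identifier (hypothetical)
    source: str       # originating system, e.g., "lab" or "pharmacy"
    concept: str      # what was observed, e.g., "serum glucose"
    value: str
    timestamp: datetime

def merge_feeds(*feeds: Iterable[Observation]) -> List[Observation]:
    """Combine observations from multiple departmental feeds into one
    chronologically ordered, patient-centered record."""
    merged = [obs for feed in feeds for obs in feed]
    return sorted(merged, key=lambda o: (o.patient_id, o.timestamp))

def find_by_concept(record: List[Observation],
                    patient_id: str, concept: str) -> List[Observation]:
    """Retrieve data by concept rather than by position in the chart."""
    return [o for o in record
            if o.patient_id == patient_id and o.concept == concept]

In a real institution each step—capture, identity matching, merging, and retrieval—is a far more elaborate process, which is precisely why local expertise and process re-engineering matter.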

Experience has shown that physicians are “horizontal” users of information technology (Greenes and Shortliffe 1990). Rather than becoming “power users” of a
narrowly defined software package, they tend to seek broad functionality across a
wide variety of systems and resources. Therefore, routine use of computers and
EHRs is most easily achieved when the computing environment offers a critical mass
of functionality that makes the system seamlessly integrated with workflow and useful
for essentially every patient encounter. The arguments for automating clinical care
records are summarized in Chaps. 2 and 12 and in the now classic report of the
Institute of Medicine on computer-based patient records (CPR) (Dick and Steen 1991, revised 1997). One argument that deserves emphasis is the importance of
EHRs in supporting clinical trials: experiments in which data from specific patient
interactions are pooled and analyzed in order to learn about the safety and
effectiveness of new treatments or tests and to gain insight into disease processes
that are otherwise not well understood. Medical researchers have in the past been limited by clumsy methods for acquiring the data needed for clinical trials, generally based on the manual capture of information on data sheets that were then transcribed into computer databases for statistical analysis (Fig. 1.4). This approach was laborious, fraught with opportunities for error, and compounded by the high costs associated with prospective randomized research protocols.

The use of EHRs has offered many advantages to those conducting clinical research
(see Chap. 26). Most obviously, it helps eliminate the manual task of extracting data
from charts or filling out specialized data sheets. The data needed for a study can
often be derived directly from the EHR, thus making much of what is required for
research data collection simply a byproduct of routine clinical record keeping (Fig.
1.5). The advantages accumulate further. For example, the record environment can help ensure compliance with a research protocol by signaling to a physician when a patient is eligible for a study or when a study protocol requires a specified management plan given the currently available data about that patient. We
are also seeing the development of new authoring environments for clinical trial
protocols that can help ensure that data elements required for the trial are compatible
with local EHR conventions for representing patient descriptors.
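As a simple illustration of that eligibility-signaling idea, consider how such a check might look in code. This is only a sketch in Python; the inclusion criteria, field names, and thresholds are invented for illustration and do not represent any real study protocol.

# Hypothetical patient summary, as it might be derived from an EHR.
patient = {
    "age": 58,
    "diagnoses": {"type 2 diabetes"},
    "latest_hba1c": 8.4,   # percent; invented value
    "on_insulin": False,
}

def eligible_for_trial(p: dict) -> bool:
    """Return True if the patient meets every (invented) inclusion criterion."""
    return (
        45 <= p["age"] <= 75
        and "type 2 diabetes" in p["diagnoses"]
        and p["latest_hba1c"] >= 7.5
        and not p["on_insulin"]
    )

if eligible_for_trial(patient):
    print("Flag chart: patient may be eligible for study enrollment.")

Because the check runs against data already collected during routine care, research recruitment becomes a byproduct of record keeping rather than a separate manual effort.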
Another issue in the changing world of health care is the increasing investment in
creating standard order sets, clinical guidelines, and clinical pathways (see Chap.
22), generally in an effort to reduce practice variability and develop consensus
approaches to managing recurring problems. Several governments and professional
organizations, as well as individual provider groups, have invested heavily in
developing guidelines, often placing emphasis on using evidence from the literature,
rather than expert opinion alone, as a basis for advice. Despite the success in
creating such evidence-based guidelines, there is a growing recognition that we need
better methods for bringing decision logic to the point of care. Guidelines that appear
in monographs or journal articles tend to sit on shelves, unavailable when the
knowledge they contain would be most valuable to practitioners. Informatics tools to
implement such guidelines, and integrate with the EHR, present a means to make
high-quality advice available in the routine clinical setting.
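A sketch may help show what “bringing decision logic to the point of care” means in practice. The rule below is deliberately simplified Python; the age threshold and wording are illustrative assumptions, not actual clinical guidance.

from typing import Optional

def influenza_vaccine_reminder(age_years: int,
                               last_flu_shot_year: Optional[int],
                               current_year: int) -> Optional[str]:
    """Return a point-of-care reminder if the (simplified) rule applies."""
    if age_years >= 65 and (last_flu_shot_year is None
                            or last_flu_shot_year < current_year):
        return "Guideline reminder: annual influenza vaccination is due."
    return None

reminder = influenza_vaccine_reminder(age_years=72,
                                      last_flu_shot_year=2010,
                                      current_year=2012)
if reminder:
    print(reminder)  # would be surfaced in the EHR during the encounter

Encoded this way, a guideline executes against each patient's data at the moment of care instead of sitting unread in a monograph.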

1.1.3 Anticipating the Future of Electronic Health Records

One of the first instincts of software developers is to create an electronic version of an object or process from the physical world. Some familiar notion provides the inspiration for a new software product. Once a software version has been developed,
however, human ingenuity and creativity often lead to an evolution that extends the
software version far beyond what was initially contemplated. The computer can
therefore facilitate paradigm shifts in the way we think about such familiar concepts.
Consider, for example, the remarkable difference between today's office automation
software and the typewriter, which was the original inspiration for the development of
“word processors.” Although early word processors were largely designed to allow
users to avoid having to retype documents every time a minor change was made to a
document, today's document management software bears little resemblance to a
typewriter. Consider all the powerful desktop publishing tools—ease of insertion,
figure integration, spell checking, grammar aids, Web “publishing,” use of color, and
so on. Likewise, today's spreadsheet programs bear little resemblance to the tables
of numbers we once created on graph paper. To take an example from the financial world, consider automated teller machines (ATMs) and their facilitation of today's global banking in ways that were never contemplated when the industry relied on human bank tellers. It is therefore logical to ask what the health record will become after
it has been effectively implemented in new computer systems and the opportunities
for its improvement become increasingly clear to us. It's clear that EHRs a decade
from now will be markedly different from the antiquated paper binders that until
recently dominated most of our healthcare environments. Note that the state of EHR
today is roughly comparable to the state of commercial aviation in the 1930s. At that
time, air travel had progressed substantially since the days of the Wright brothers,
and air travel was becoming commonplace. But 1930s air travel seems archaic by
modern standards, and it's logical to assume that today's EHRs, while far better than
the paper records and early computer systems of the 1960s and 1970s, will be
greatly improved and modernized even further in the decades to come. If people had
not been able to use early airplanes to travel, the quality and efficiency of airplanes
and air travel would not have improved as they have. A similar point can be made
about the importance of committing to using EHRs today, even though we know they
need to be much better in the future.

1.2 Communications Technology and Health Data Integration

An obvious opportunity to change the role and functionality of clinical-care records in the digital age is the power and ubiquity of the Internet. The Internet began in 1968 as a U.S. research activity funded by the Department of Defense's Advanced Research Projects Agency (ARPA). Initially known as the ARPANET, the network began as a novel mechanism to allow a handful of defense-
related mainframe computers, located primarily at academic institutions or the
research facilities of military contractors, to share data files with each other and
provide remote access to computing power elsewhere. The notion of email
soon emerged. From then on, electronic machine-to-machine mail exchanges
quickly became an important component of network traffic. As the technology
matured, its value for non-military research activities was recognized, and in 1973 the first medically related research computer was added to the network (Shortliffe 1998a, 2000). During the 1980s, the technology began to be
developed in other parts of the world, and the National Science Foundation
took over the task of running the main high-speed backbone network in the
United States. Hospitals, mostly academic centers, began to be connected to what was by then known as the Internet, and in one major policy move it was decided to allow commercial organizations to join the network as well. By April 1995, the Internet in the United States had become a fully commercialized operation, no longer dependent on the U.S. government to support even the major backbone connections. Today, the Internet is ubiquitous, accessible
through mobile wireless devices, and has provided the invisible but mandatory
infrastructure for social, political, financial, scientific, and entertainment
endeavors. Many people point to the Internet as a prime example of the
enabling role of federal investment in promoting innovative technologies. The
Internet is a major societal force that arguably would never have been
created if research and development, as well as the coordination of activities,
had been left to the private sector.
The explosive growth of the Internet did not occur until the late 1990s, when
the World Wide Web (which had been initially conceived by the physics
community as a way to use the Internet to share preprints with photographs
and diagrams among researchers) was introduced and popularized. Browsing
the Web is highly intuitive, requires no special training, and provides a
mechanism for accessing multimedia information that explains its remarkable
growth as a worldwide phenomenon. The social impact of this phenomenon
cannot be overstated, especially given the international connectivity that has
grown phenomenally over the past two decades.
Countries that were once isolated from information that was important to
citizens, ranging from consumers to scientists to policy makers, are now
finding new options for bringing timely information to the desktop, machines
and mobile devices of people with an Internet connection. At the same time,
there has been great upheaval in the telecommunications industry, with
companies that used to be in different businesses (e.g., cable television,
Internet and telephone services) now finding that their activities and
technologies have merged. In the United States, legislation was passed in
1996 to allow new competition to develop and new industries to emerge. We
have subsequently seen the fusion of technologies such as cable television,
telephone networks, and satellite communications. High-speed lines into homes and offices are widely available, wireless networks are ubiquitous, and inexpensive mechanisms for connecting to the Internet without using conventional computers (for example, via mobile phones or set-top boxes) have also emerged. The impact on everyone has been great; it is affecting the way individuals search for health-related information and is also
improving the way patients can access their healthcare providers and their
clinical data. Just as individual hospitals and healthcare systems have come to
appreciate the importance of integrating information from multiple clinical and
administrative systems within their organizations (see Chap. 14), health
planners and governments now appreciate the need to develop integrated
information resources that combine clinical and health data from multiple
institutions within regions and ultimately at the national level (see Chaps. 13 and 16). The Internet, and hence the role of digital communications, has thus become an important part of modern medicine and health. Although this theme recurs in essentially every chapter of this book, we introduce it here because of its importance to modern technical issues and policy directions. Clinical data can also flow into regional and national registries, as well as into research databases that can support retrospective studies (see Chap. 11) or formal institutional or community-based clinical trials (see Chap. 26). The information analyzed from registries and research studies can in turn be used to develop standards for prevention and treatment and to help set the direction of biomedical research. Researchers can extract information
directly from health records or from data aggregated into registries. Treatment
standards can in turn be translated into protocols, guidelines and educational
materials. This new knowledge and decision support functionality can be sent
over the network to physicians for information to inform patient care, where it
seamlessly integrates with EHRs and order entry systems. This notion of a
system that allows us to learn from what we do, unleashing experience that
has traditionally been stored unusably in paper charts, is gaining a lot of
attention now that we can imagine an interconnected community of physicians
and institutions, building digital data resources using EHRs. The concept has
been dubbed a learning health care system and is a subject of ongoing study
by the Institute of Medicine, which has published a series of reports on the
topic (IOM 2007; 2011; 2012).
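The cycle can be suggested with a toy example: once EHR data from many sites are pooled, even a simple aggregation yields knowledge that no single paper chart could. The Python below uses entirely invented data; "drug_x" and the outcome field are placeholders.

# Toy pooled registry drawn from several sites (all values invented).
pooled_registry = [
    {"site": "A", "treatment": "drug_x", "complication": False},
    {"site": "A", "treatment": "drug_x", "complication": True},
    {"site": "B", "treatment": "drug_x", "complication": False},
    {"site": "B", "treatment": "drug_x", "complication": False},
]

cases = [r for r in pooled_registry if r["treatment"] == "drug_x"]
rate = sum(r["complication"] for r in cases) / len(cases)
print(f"Observed complication rate for drug_x: {rate:.0%}")  # prints 25%

Feeding such aggregate findings back to clinicians as guidance is the essence of the learning cycle described above.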

1.2.1 Implications of the Internet for Patients

As Internet penetration continues to grow, it is no surprise that an increasing number of patients, as well as healthy individuals, are turning to the Internet for health
information. It is a rare American doctor who has not encountered a patient who
arrives for an appointment armed with a question, or a stack of impressions, that
arose from medical-related searches on the Web. Companies that offer Internet
search engines report that health-related sites are among the most popular being
explored by consumers. As a result, physicians and other care providers must be
prepared to deal with information that patients discover on the Web and bring with
them when they seek care from clinicians. Some of the information is timely and
excellent; in this sense, physicians can often learn about innovations from their
patients and must be increasingly open to the kinds of questions that this improved
access to information will generate from patients in their practices. On the other hand,
much of the health information on the Web lacks peer review or is purely anecdotal.
People who lack medical training may be misled by such information, just as they
have been misled in the past by information printed in books and magazines
discussing fad treatments from anecdotal sources. In addition, some sites offer
personalized advice, sometimes for a fee, with all the attendant concerns about the
quality of the suggestions and the ability to give valid advice based on an email or
Web-based interaction. In a positive light, new communications technologies offer
physicians creative ways to interact with their patients and provide higher quality
care. Medicine adopted the telephone as a standard vehicle to facilitate patient care
years ago, and now we take this type of interaction with patients for granted. If we
extend the audio channel to include our visual sense as well, typically relying on the
Internet as our communication mechanism, the notion of telemedicine emerges (see
Chap. 18). This notion of “distance medicine” emerged at the beginning of the 20th
century (see Fig. 1.9), but the technology was too limited for much penetration of the
idea beyond telephone conversations until the last 30-40 years. Subsequently, the
use of telemedicine has grown rapidly, and there are specialized settings in which it is
already proving to be successful and cost-effective (e.g., rural care, international
medicine, teleradiology, and video-based care of patients in prisons).
1.2.2 Requirements to Achieve the Vision

Efforts that continue to advance the state of the art in Internet technologies all have
significant implications for the future of health care delivery in general and EHRs and
their integration in particular (Shortliffe 1998b, 2000). But beyond increasing the
speed, reliability, security and availability of the Internet, there are many other areas
that need attention if the vision of a learning healthcare system is to be achieved.

1.2.2.1 Education and Training

There is a difference between computer literacy (familiarity with computers and their
routine uses in our society) and knowledge of the role that computing and
communications technology can and should play in our health care system. We are
generally doing a poor job of training future physicians in this latter area and are
therefore leaving them ill-equipped for the challenges and opportunities they will face
in the rapidly changing practice environments that surround them (Shortliffe 2010).

Moreover, much of the vision for the future we have proposed here can be achieved
only if educational institutions produce a cadre of talented individuals who not only
understand computing and communications technology, but also have a deep
understanding of the biomedical environment and the needs of physicians and other
health care workers. Computer science training alone is not adequate. Fortunately,
we have begun to see the creation of formal training programs in what has become
known as biomedical informatics (see Sect. 1.4 ) that provide personalized
educational opportunities. Many of the trainees are research scientists, physicians, nurses, pharmacists, and other health care professionals who see
professional opportunities and challenges at the intersections of biomedicine,
information science, computer science, decision science, cognitive science, and
communication technology. As has been clear for more than two decades (Greenes
and Shortliffe 1990), however, demand for such individuals far outstrips supply, both
for academic and industrial career paths. We need more training programs, expansion of those that already exist, and support for junior faculty in health science schools who wish to pursue further training in this area.

1.2.2.2 Organization and Change Management


Second, as mentioned above, there needs to be greater understanding among
healthcare leaders regarding the role of specialized multidisciplinary expertise in the
clinical success of systems implementation. The health care system provides some of
the most complex organizational structures in society (Begun and Zimmerman 2003),
and it is simplistic to assume that off-the-shelf products will be seamlessly introduced
into a new institution without major efforts at analysis, redesign, and cooperative co-
development. Lack of investment and lack of understanding of the requirements for
process reengineering as part of software implementation, as well as problems with
technical direction and planning, account for many of the frustrating experiences that
healthcare organizations report about their efforts to use computers more effectively
in support of patient care and provider productivity. The notion of a learning health
system described above is intended to motivate your enthusiasm for what is coming
and to suggest the topics that should be addressed in a book like this. Essentially, all of the following chapters touch on some aspect of this vision of integrated systems that extend beyond single institutions. Before entering into these
issues, however, we must emphasize two points. First, the cyclical creation of new
knowledge in a learning health system will become a reality only if individual hospitals,
academic medical centers, and national coordinating bodies work together to provide
the necessary standards, infrastructure, and resources. No individual, developer,
vendor, or system administrator can mandate the standards of connectivity, data
pooling, and data sharing that a learning healthcare system entails. A national
cooperative planning and implementation initiative for computing and
communications resources within and between institutions and clinics is required
before practitioners have routine access to the information they need (see Chap. 13).
A recent federal incentive program for EHR implementation is a first step in this
direction (see Sect. 1.3 ). The criteria that are required for successful EHR
implementation are sensitive to the need for data integration, public health support,
and a learning health care system program. Second, although our presentation of the
learning health care system concept has focused on the physician's view of integrated
access to information, other workers in the field have similar needs that can be
addressed in similar ways. The research community has already developed and
made use of much of the technology that needs to be merged if the clinical user is to
have similar access to data and information. There is also the patient's point of view,
which must be considered in the notion of patient-centred health care that is now
widely accepted and encouraged (Ozkaynak et al. 2013).

1.3 The United States Government Steps In

During the first decades of the evolution of clinical information systems for use in hospitals, clinical practice, and public health, the primary role of government was to support
the research enterprise as new methods were developed, tested, and formally
evaluated. The topic was rarely mentioned by the nation's leaders, however, even
during the 1990s when the White House was considered especially tech-savvy. It
was therefore notable when, in the President's State of the Union address in 2004
(and in each of the following years of his administration), President Bush called for
universal implementation of electronic health records within 10 years. Health and
Human Services Secretary Tommy Thompson also supported this, and in May 2004
created an entity intended to support expanded EHR use: the Office of the National
Coordinator for Health Information Technology (initially referred to by the full acronym ONCHIT, but later abbreviated simply to ONC). There was a limited budget for the
ONC, although the organization served as a convening body for planning efforts
related to EHRs and the National Health Information Infrastructure (see Chaps. 12, 13
and 27). The topic of EHRs subsequently became a talking point for both major
candidates during the 2008 presidential election, with strong bipartisan support.
However, it was the American Recovery and Reinvestment Act
(ARRA) in early 2009, also known as the economic “Stimulus Act,” which first
provided significant funding to provide tax incentives for health systems, hospitals,
and providers to implement EHRs in their practices. Such payments were available,
however, only when eligible organizations or individual professionals implemented
EHRs that were “certified” as meeting minimum standard requirements and when they could document that they were making “meaningful use” of those systems. You will see references to these certification and meaningful use criteria in many chapters of this volume. There is also a discussion of HIT policy and the federal government in Chap. 27. Although the EHR implementation process is still ongoing at present, the
trend is clear: because of the federal stimulus package, a large number of hospitals,
systems, and physicians are investing in EHRs and incorporating them into their
practices. In addition, the demand for workers trained in health information
technology has grown much faster than the overall labor market, even within health care
(Fig. 1.10). It is a striking example of how government policy and investment can spur
important transitions in systems such as health care, where many observers had
previously felt that progress had been unacceptably slow (Shortliffe 2005).

1.4 Definition of Biomedical Informatics and Related Disciplines


With the previous sections of this chapter as background, let us now consider the
scientific discipline that is the subject of this volume and has led to the development
of many of the functionalities that need to be brought together in the integrated
biomedical-informatics environment of the future. The remainder of this chapter deals
with biomedical informatics as a field and with biomedical and health information as a
topic of study. It provides additional background necessary to understand many of the
subsequent chapters in this book. Reference to the use of computers in biomedicine
evokes different images depending on the nature of one's involvement in the field.
For a hospital administrator, it might mean maintaining clinical care records using
computers; for a decision scientist, it might mean computer assistance in diagnosing
disease; for a basic scientist, it might mean using computers to maintain, retrieve,
and analyze gene sequencing information. Many physicians immediately think of
office practice tools for tasks such as patient billing or appointment scheduling.
Nurses often think of computers as tools to map the care they provide, or decision
support tools that help implement the most current patient care guidelines. The field
includes the study of all of these activities and a great many others as well. More
importantly, it includes consideration of various external factors that affect the
biomedical environment. Unless you keep these surrounding factors in mind, it can
be difficult to understand how biomedical computing can help us bridge the various
aspects of healthcare and its delivery. To achieve a unified perspective, we might
consider four related issues: (1) the concept of biomedical information (why it is
important in biological research and clinical practice and why we might want to use
computers to process it); (2) the structural features of medicine, including all those
subtopics to which computers can be applied; (3) the importance of evidence-based
knowledge of biomedical and health topics, including its derivation and appropriate
management and use; and (4) the applications of computers and communication
methods in biomedicine and the scientific issues underlying such efforts. We mention
the first two topics briefly in this and the next chapter, and provide references in the
Suggested Readings section for those students who wish to learn more. The third
theme, knowledge to support effective decision making in support of human health, is
intrinsic to this book and occurs in various forms in essentially every chapter. The
fourth topic, however, is the main theme of this book.
Computers have captured the imagination (and attention) of our society. Today's
younger individuals have always lived in a world where computers are ubiquitous and
useful. Because the computer as a machine is exciting, people may pay a
disproportionate amount of attention to it as such, at the expense of considering what
the computer can do given the numbers, concepts, ideas, and cognitive foundations
of fields such as medicine, health care, and biomedical research. Computer
scientists, philosophers, psychologists, and other scholars are increasingly
considering questions such as the nature of information and knowledge and how human beings process such concepts. These investigations have been given a sense of timeliness (if not urgency) by the mere existence of the computer. The cognitive
activities of clinicians in practice have probably received more attention in the last
three decades than in all of previous history (see Chap. 4). Again, the existence of
the computer and the possibilities of expanding a physician's cognitive powers have
motivated many of these studies. To develop computer tools to assist with decisions,
we must more clearly understand human processes such as diagnosis, therapy
planning, decision making and problem solving in medicine. We must also
understand how personal and cultural beliefs affect the way information is interpreted
and decisions are ultimately made.

1.4.1 Terminology
Since the 1960s, when increasing numbers of people doing serious biomedical
research or clinical practice had access to some form of computer system, people
have been unsure what name they should use for the biomedical application of
computer science concepts. The name computer science was new in 1960 and was only vaguely defined. Even today, the term computer science is used more as a matter of convention than as an explanation of the field's scientific content. In the 1970s we began using the phrase medical computer science to refer to the subdivision of computer science that applies the methods of the broader field to medical subjects. As you will see, however, medicine has provided a rich area for computer science research, and several computer science insights and methodologies have been derived from applied medical computing research.

The term information science, occasionally used in conjunction with computer science, originated in the field of library science and is used to refer, somewhat
generally, to the broad range of issues related to the management of both paper-
based and electronically stored information. Much of what information science
originally set out to be is now attracting renewed interest under the name of cognitive science.
Information theory, by contrast, was first developed by scientists concerned with the
physics of communication; it has evolved into what can be viewed as a branch of
mathematics. The results that scientists have obtained with information theory have
illuminated many processes in communications technology, but they have had little
effect on our understanding of human information processing. The terms biomedical
computing or biocomputing have been used for several years. They are non-
descriptive and neutral, implying only that computers are used for some purposes in biology or
medicine. They are often associated with bioengineering applications of computers,
however, where the devices are viewed more as tools for a bioengineering
application than as a primary focus of research.

In the 1970s, inspired by the French term informatique, the English-speaking community began using the term medical informatics. Those in the field were
attracted by the word's emphasis on information, which they saw as more central to
the field than the computer itself, and it gained momentum as a term for the
discipline, especially in Europe, during the 1980s. The term is broader than medical
computing (including topics such as medical statistics, record keeping, and the study
of the nature of medical information itself) and de-emphasizes the computer while
focusing instead on the nature of the field in which the applied computations are
performed. Because the term informatics became widely accepted in the United
States only in the late 1980s, medical information science was also previously used
in North America; this term, however, can be confused with librarianship, and does
not capture the broader implications of the European term. As a result, the name
medical informatics emerged in the late 1980s and has become the preferred term,
even in the United States. In fact, this is the name of the field we used in the first two
editions of this textbook (from 1990 to 2000), and it is still sometimes used in
professional, industrial, and academic settings. However, many observers expressed
concern that the adjective "medical" is too physician-centric and fails to appreciate
the relevance of this discipline to other health services and life sciences
professionals. Thus, the term health informatics, or health care informatics, gained
some popularity, although it has the disadvantage that it tends to exclude applications
to biomedical research (Chaps. 24 and 25) and, as we argue shortly, tends to focus the
field name on domains of application (clinical care, public health, and prevention)
rather than the basic discipline and its broad range of applicability. Applications of
computational methods in biology and genetics exploded during the 1990s due to the
Human Genome Project and the growing recognition that modern life sciences
research was no longer possible without computational support and analysis (see
Chapters 24 and 25). By the late 1990s, the use of computer methods in such work
had become widely known as bioinformatics, and the director of the National
Institutes of Health (NIH)
appointed an advisory group called the Working Group on Biomedical Informatics. In
June 1999, the group provided a report recommending that the NIH undertake an initiative called the Biomedical Information Science and Technology Initiative (BISTI).

With the subsequent creation of another NIH organization called the Bioinformatics
Working Group, the visibility of computer applications in biology was greatly
enhanced. Today, bioinformatics is a major area of activity at the NIH and at many
universities and biotechnology companies around the world. The explosive growth of
this field, however, has added to the confusion regarding the naming conventions we
have been discussing. Furthermore, the relationship between medical informatics and
bioinformatics became unclear. As a result, in an effort to be more inclusive and
embrace the biological applications with which many medical informatics groups had
already been involved, the name medical informatics gradually gave way to
biomedical informatics (BMI). Several academic groups have changed their names, and a major medical informatics journal (Computers and Biomedical Research) was reborn in 2001 as the Journal of Biomedical Informatics. Despite
this convoluted naming history, we believe that the wide range of problems in
biomedical information management requires an appropriate name, and, beginning
with the third edition of this book (2006), we use the term biomedical informatics for
this purpose. We believe it will become the most widely accepted term for the core discipline, and it should be seen as broadly encompassing all areas of application in health, clinical practice, and biomedical research. When we talk specifically about computers and
their use within biomedical informatics activities, we use the terms biomedical
computer science (for the methodological issues) or biomedical computing (to describe the
activity itself). Keep in mind, however, that biomedical informatics has many other
component sciences besides computer science. These include decision sciences,
statistics, cognitive science, information science, and even management sciences.
We return to this point shortly when we discuss the basic versus applied nature of the
field when viewed as a basic research discipline. Although labels like these are
arbitrary, they are by no means insignificant. In the case of new fields of activity or
branches of science, they are important both in designating the field and in defining
or restricting its content. The most distinctive feature of the modern computer is the
generality of its application. The almost unlimited range of computer uses
complicates
the matter of naming the field. As a result, the nature of computer science is perhaps
best illustrated by examples rather than by attempts at formal definition. Much of this
book presents examples that do exactly this for biomedical informatics as well. The
American Medical Informatics Association (AMIA), which was founded in the late
1980s under the discipline's former name, has acknowledged the confusion regarding
the field and its definition. It accordingly appointed a task force to develop a definition of the field and to specify the core competencies that should be acquired by students seeking graduate training in the discipline. The resulting definition, published in AMIA's journal and approved by the organization's board, identifies
the focus of the field in a simple sentence and then adds four clarifying corollaries
that refine the definition and the scope and content of the field (Table 1.1). We adopt
this definition, which is very similar to the one we offered in previous editions of this
text. It recognizes that the emergence of biomedical informatics as a new discipline is due in part to rapid advances in computer and communications
technology, a growing awareness that the knowledge base of biomedicine is
essentially unmanageable by traditional paper-based methods, and a growing
conviction that the process of informed decision-making is as important to modern
biomedicine as is the collection of facts on which clinical decisions are based or
research plans are developed.

1.4.2 Historical Perspective

The modern digital computer emerged from developments in the United States and
abroad during World War II, and general-purpose computers began to appear on the
market in the mid-1950s (Fig. 1.11). Speculation about what could be done with such
machines (if they could ever be made reliable) had begun, however, much earlier. Scholars, at
least since the Middle Ages, had often asked the question of whether human
reasoning can be explained in terms of formal or algorithmic processes. Gottfried
Wilhelm von Leibnitz, a 17th-century German philosopher and mathematician,
attempted to develop a calculus that could be used to simulate human reasoning.
The notion of a “logic engine” was further developed by Charles Babbage in the mid-
19th century. The first practical application of automatic computing relevant to
medicine was Herman Hollerith's development of a punched-card data-processing system for the 1890 U.S. census (Fig. 1.12). His methods were soon adapted
to epidemiology and public health surveys, ushering in the era of electromechanical
punched card data processing technology, which matured and was widely adopted
during the 1920s and 1930s. These techniques were the precursors of the stored
program and fully electronic digital computers, which began to appear in the late
1940s (Collen 1995). One of the first activities of biomedical computing was the
attempt to build systems that would assist a physician in decision making (see Chap.
22). Not all biomedical computing programs followed this course, however. Many of the early ones instead investigated the notion of a comprehensive hospital information system (HIS; see Chap. 14). These projects were perhaps less ambitious in the sense that they were more concerned with short-term practical applications, but the difficulties they encountered were still formidable. The earliest work on HIS in the United States was probably that
associated with General Electric's MEDINET project, followed by work at Bolt, Beranek and Newman in Cambridge, Massachusetts, and then at Massachusetts General Hospital (MGH) in Boston by Barnett and his associates over three decades beginning in the early 1960s. Work on similar systems was done by Warner at Latter-day Saints (LDS) Hospital in Salt Lake City, Utah, by Collen at Kaiser Permanente in Oakland, California, by Wiederhold at Stanford University in Stanford, California, and by scientists at Lockheed in Sunnyvale, California.
The course of HIS applications diverged during the 1970s. One approach was based on the concept of an integrated or monolithic design in which a single large, time-shared computer would be used to support an entire collection of applications. An alternative was a distributed design that favored the separate implementation of specific applications on smaller individual computers—minicomputers—thereby permitting the independent evolution of systems in the respective application areas. A common assumption was the existence of a single shared database of patient information. The multi-machine model was not practical, however, until network technologies enabled rapid and dependable communication among distributed and (sometimes) heterogeneous types of machines. Such distributed HIS began to appear in the 1980s (Simborg et al. 1983).
Biomedical-informatics activity expanded in scope and accelerated with the advent of
the minicomputer in the early 1970s. These machines made it possible for individual
departments or small organizational units to acquire their own dedicated computers
and develop their own application systems (Fig. 1.13). In conjunction with the
introduction of general-purpose software tools that provided standardized facilities to
people with limited computer training (such as the UNIX operating system and
programming environment), the minicomputer put more computing power into the hands of more biomedical researchers than any other development until the introduction of the
microprocessor, a central processing unit (CPU) contained on one or a few chips
(Fig. 1.14).
Everything changed radically in the late 1970s and early 1980s, when the microprocessor and the personal computer (PC), or microcomputer, became available. Not only could
hospital departments purchase minicomputers, but now individuals could also
purchase microcomputers.
This change greatly expanded the computing base in our society and gave rise to a
new software industry.
The first articles about computers in medicine had appeared in clinical journals in the late 1950s, but it was not until the late 1970s that the first computer-related advertisements aimed at physicians began to appear (Fig. 1.15). Within a few years, a wide range of
information-management software tools were available as products; their descriptions began to appear in journals alongside traditional advertisements for drugs and other medical products. Nowadays, individual physicians find it practical to employ PCs in a variety of configurations, including for applications in patient care or clinical research. The stage is now set with a wide range of
hardware of various sizes, types, prices and capabilities, all of which will continue to
evolve in the coming decades. The trend—reductions in size and cost of computers
with simultaneous
increases in power (Fig. 1.16)—shows no signs of slowing down, although scientists
foresee ultimate physical limitations to the miniaturization of computer circuits.
Advances in biomedical-informatics research will continue to be tied to the availability
of funding from government or commercial sources. Because most biomedical
computing research is exploratory and far from ready for commercial application, the
federal government has played a key role in funding the work over the past four
decades, primarily through the NIH and the Agency for Healthcare Research and
Quality (AHRQ). The National Library of Medicine (NLM) has taken on a leading role
in biomedical informatics, especially with support for research in the field (Fig. 1.17).
As the number of commercially successful applications increases, it is likely that more development work will shift to industrial settings and that university
programs will increasingly focus on fundamental research problems seen as too
speculative for near-term commercialization.

1.4.3 Relationship to Biomedical Science and Clinical Practice

The exciting achievements of biomedical informatics, and the potential for future
benefits to medicine, must be seen in the context of our society and the existing
healthcare system. As early as 1970, an eminent physician suggested that computers
might eventually have a revolutionary influence on medical care, medical education,
and even the selection criteria for health science students (Schwartz 1970).
The subsequent huge growth in computing activity has been greeted with some concern by health professionals, who ask where it will all end. Will health care workers gradually be replaced by computers? Will nurses and doctors need to be highly trained in computer science or informatics before they can practice their professions effectively? Will patients and health care workers eventually rebel rather than accept a trend toward automation that they believe may threaten the traditional humanistic values of health care delivery? Biomedical informatics is the scientific discipline underlying the broad range of possible areas of application (Fig. 1.19). The analogy with other basic sciences is that biomedical informatics uses the results of past experience to understand, structure, and encode objective and subjective biomedical findings and thus to make them suitable for processing. This approach supports the integration of findings and their analyses. In turn, the selective distribution of newly created knowledge can aid patient care, health planning, and basic biomedical research.
Biomedical informatics is, by its nature, an experimental science, characterized by asking questions, designing experiments, performing analyses, and using the information obtained to design new experiments. One goal is simply to seek new knowledge; this is called basic research. A second goal is to use this knowledge for practical purposes; this is called applied research. There is a continuity between these two endeavors (see Fig. 1.19). In biomedical computing, there is a particularly close coupling between the areas of application, broad categories of which are indicated at the bottom of Fig. 1.19, and the identification of the basic research tasks that characterize the scientific foundations of the field. Research,
however, has shown that there can be a very long time lag between the development
of new concepts and methods in basic research and their eventual application in
the biomedical world (Balas and Boren 2000).
Furthermore (see Fig. 1.20), many discoveries are discarded along the way, leaving only a small percentage of basic research discoveries that have a practical influence on the health and care of patients.
Work in biomedical informatics (BMI) is
intrinsically motivated by the problems encountered in a set of applied domains in
biomedicine. The first of these has historically been clinical care.
(including medicine, nursing, dentistry and veterinary care), an area of activity that
demands
patient-oriented computer applications. We refer to this area as clinical informatics. It includes several subtopics and areas of specialized expertise, including patient-care foci such as nursing informatics, dental informatics, and even veterinary informatics. Indeed, the discipline's former name, medical
informatics, is now reserved for applied research and practice topics that focus on
disease and the role of physicians. As discussed above, the term “medical
informatics” is no longer used to refer to the discipline as a whole.
Closely linked to clinical informatics is public health informatics (Fig. 1.19), where
similar methods are generalized for application to patient populations rather than to
single individuals (Chap. 16). Thus, clinical informatics and public health
informatics share many of the same methods and techniques. Two other major areas
of application overlap in some ways with clinical informatics and public health
informatics. These include imaging informatics (and the set of issues developed around
radiology and other image management and image analysis domains such as
pathology, dermatology, and molecular visualization—see Chaps. 9 and 20). Finally,
there is the burgeoning area of bioinformatics, which at the molecular and cellular
level is offering challenges that rely on many of the same computational methods as
well (see Chap. 24).
As shown in Fig. 1.21, there is a spectrum as one moves from left to right across
these BMI application domains. In bioinformatics, workers deal with molecular and
cellular processes in the application of computer methods. At the next level, workers
focus on tissues and organs, which tend to be the emphasis of imaging informatics
work (also called structural informatics in some institutions). Moving into clinical
informatics, the focus shifts to individual patients and eventually to public health,
where researchers address population and societal problems. The core science of
biomedical informatics has important contributions to make across that spectrum, and
many computational methods are broadly applicable across the same range of
domains.
Note from Fig. 1.19 that biomedical informatics and bioinformatics are not synonyms; it is incorrect to refer to the scientific discipline as bioinformatics, which is, rather, an important area of application of BMI methods and concepts. Similarly, the term health informatics, which refers to applied research and practice in clinical and public health informatics, is also not an appropriate name for the core discipline, since BMI is applicable to basic human biology as well as to health. Note also that the boundaries between application areas are fuzzy, and many areas of applied informatics research involve more than one of the categories. For example, biomolecular imaging involves both bioinformatics and imaging informatics concepts. Similarly, consumer health informatics (see Chap. 17) includes elements of both clinical informatics and public health informatics.
Another important area of BMI research activity is pharmacogenomics (see Chap. 25), which is the effort to infer the genetic determinants of human response to drugs. Such work requires the analysis of linked genotype and phenotype databases and is therefore at the intersection of bioinformatics and clinical informatics.
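The core computational step in such work is a join between two databases keyed on the same individuals. The following is a minimal sketch of that idea; the gene, the metabolizer categories, and all data values are hypothetical and purely illustrative.

```python
# Minimal sketch of a genotype-phenotype linkage, the kind of join that
# pharmacogenomic analyses depend on. All identifiers and data values
# are hypothetical.
from collections import defaultdict

# Genotype database: patient ID -> (assumed) CYP2D6 metabolizer status
genotypes = {
    "pt001": "poor", "pt002": "extensive",
    "pt003": "extensive", "pt004": "poor",
}

# Phenotype database: observed response to a hypothetical drug
responses = [
    ("pt001", False), ("pt002", True),
    ("pt003", True), ("pt004", False),
]

# Link the two databases on patient ID and tally responses by genotype
tally = defaultdict(lambda: [0, 0])  # genotype -> [responders, total]
for patient_id, responded in responses:
    genotype = genotypes.get(patient_id)
    if genotype is None:
        continue  # phenotype record with no linked genotype; skip it
    tally[genotype][1] += 1
    tally[genotype][0] += int(responded)

for genotype, (responders, total) in sorted(tally.items()):
    print(f"{genotype} metabolizers: {responders}/{total} responded")
```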
In general, BMI researchers draw their inspiration from one of the application areas, identifying fundamental methodological issues that need to be addressed and testing them on prototype systems or, for more mature methods, on actual systems that are used in clinical settings or in biomedical research. An important implication of this view is that the core discipline is identical regardless of the application area that a given individual is motivated to address, although some BMI methods have greater relevance to some domains than to others. This argues for unified BMI educational programs, ones that bring together students with a wide variety of application interests. Elective courses and internships in specific areas of interest are, of course, important to complement the core exposures that students should receive, but, given the need for teamwork and understanding in the field, separating learners based on the application areas that interest them would be counterproductive and wasteful.
BMI's scientific contributions can also be appreciated for their potential to benefit the education of health professionals (Shortliffe 2010). For example, in the education of medical students, the various cognitive activities of physicians have traditionally tended to be considered separately and in isolation; they have largely been treated as if they were independent and distinct modules of performance. An activity that is attracting increasing interest is formal medical decision making (see
Chap. 3). The specific content of this area remains to be fully defined, but the
discipline's reliance on formal methods and its use of knowledge and information
reveal it to be one aspect of biomedical informatics. A particular topic in the study of
medical decision making is diagnosis, which is often conceived and taught as if it
were an independent activity. Medical students may be led to view diagnosis as a
process that physicians carry out in isolation before choosing therapy for a patient or
moving on to other modular tasks. A number of studies have shown that this model is
oversimplified and that such a decomposition of cognitive tasks can be quite
misleading (Elstein et al. 1978; Patel and Groen 1986). Doctors seem to be dealing
with several tasks at the same time. Although a diagnosis may be one of the first
things doctors think about when they see a new patient, patient assessment
(diagnosis, management, analyzing treatment outcomes, monitoring disease
progression, etc.) is a process that is never really over. A doctor must be flexible and
open-minded. It is usually appropriate to modify the original diagnosis if it turns out
that treatment based on it is unsuccessful or if new information weakens the evidence
supporting the diagnosis or suggests a second, concurrent disorder. These issues
are discussed in more detail in Chap. 4. When we talk about making a diagnosis,
choosing a treatment, managing a therapy, making decisions, monitoring a patient, or
preventing a disease, we are using labels for different aspects of health care, an
entity that has an overall unity. The fabric of health care is a continuum in which these elements are closely intertwined. Regardless of whether we consider biomedical computing a profession, a technology, or a science, there is no doubt about its importance to biomedicine. We can safely assume that computers will be used increasingly in clinical practice, biomedical research, and education in the health sciences.
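To make the flavor of such formal methods concrete, consider how a probability assigned to a diagnosis is revised as new evidence arrives. The sketch below applies Bayes' rule to a single test result; the pretest probability, sensitivity, and specificity are invented numbers used only for illustration.

```python
# Minimal sketch (with invented numbers) of the kind of formal method
# taken up in Chap. 3: revising a diagnostic probability with Bayes' rule.
def post_test_probability(pretest: float, sensitivity: float,
                          specificity: float) -> float:
    """Probability of disease given a positive test result."""
    true_pos = sensitivity * pretest
    false_pos = (1.0 - specificity) * (1.0 - pretest)
    return true_pos / (true_pos + false_pos)

# A disease suspected with 30% pretest probability; the test is assumed
# to be 90% sensitive and 85% specific.
p = post_test_probability(pretest=0.30, sensitivity=0.90, specificity=0.85)
print(f"post-test probability after a positive result: {p:.2f}")  # ~0.72
```

The same function, applied repeatedly as results accumulate, mirrors the iterative character of patient assessment described above: each new finding updates, rather than finalizes, the diagnostic picture.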

1.4.4 Relationship to Computer Science
During its evolution as an academic entity in universities, computer science followed an unsettled course as involved faculty attempted to identify key themes in the field and to find the discipline's organizational place. Many computer science programs were located in departments of electrical engineering, because the principal concerns of their researchers were computer architecture and the design and development of practical hardware components. At the same time, computer scientists were interested in programming languages and software, endeavors that are not particularly characteristic of engineering. Furthermore, their work with algorithm design, computability theory, and other theoretical topics seemed more closely related to mathematics. Biomedical informatics draws on all of these activities: development of hardware, of software, and of computer science theory. Biomedical computing generally has not had a large enough market to influence the course of major hardware developments; that is, computers have not been developed specifically for biomedical applications. Not since the early 1960s (when health informatics experts occasionally talked about and, in some cases, developed special medical terminals) have people assumed that biomedical applications would use hardware other than that designed for general use. The question of whether biomedical applications would require specialized programming languages might have been answered affirmatively in the 1970s by anyone examining the Massachusetts General Hospital Utility Multi-Programming System, known as the MUMPS language (Greenes et al. 1970; Bowie and Barnett 1976), which was specially developed for use in medical applications. For several years, MUMPS was the most widely used language for processing medical records. Under its new name, M, it is still in widespread use, and new implementations have been developed for each generation of computers. M, however, like any programming language, is not equally useful for all computing tasks. Furthermore, the software requirements of medicine are now better understood, and they no longer seem to be unique; rather, they are specific to the type of task. A scientific computing program looks more or less the same whether it is designed for chemical engineering or for pharmacokinetic calculations.
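The point is easy to see in code. The following sketch computes drug concentration over time under a one-compartment intravenous-bolus model, C(t) = (dose/V) * exp(-kt); the dose, volume of distribution, and elimination rate constant are illustrative values, not parameters of any real drug, and nothing in the routine is specific to medicine.

```python
# One-compartment IV-bolus pharmacokinetics: C(t) = (dose / V) * exp(-k * t).
# The numbers are illustrative only; an exponential-decay routine like this
# would look identical in a chemical engineering setting.
import math

def concentration(dose_mg: float, volume_l: float,
                  k_per_h: float, t_h: float) -> float:
    """Plasma concentration (mg/L) at t_h hours after an IV bolus."""
    return (dose_mg / volume_l) * math.exp(-k_per_h * t_h)

# Tabulate the concentration over the first 12 hours
for t in range(0, 13, 2):
    c = concentration(dose_mg=500.0, volume_l=42.0, k_per_h=0.17, t_h=float(t))
    print(f"t = {t:2d} h   C = {c:5.2f} mg/L")
```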
So how does BMI differ from biomedical computer science? Is the new discipline simply the study of computer science with a "biomedical flavor"? If you return to the definition of biomedical informatics that we provided in Sect. 1.4.1 and then refer to Fig. 1.19, we believe you will begin to see why biomedical informatics is more than simply the biomedical application of computer science. Not only do the problems it addresses have broad relevance to health, medicine, and biology, but the underlying sciences on which BMI professionals draw are inherently interdisciplinary as well (and are not limited to computer science topics). For example, successful BMI research will often draw on, and contribute to, computer science, but it may also be closely related to the decision sciences (probability theory, decision analysis, or the psychology of human problem solving), cognitive science, the information sciences, or the management sciences (Fig. 1.22). Furthermore, a biomedical informatics researcher will be closely linked to some underlying problem in the real world of health or biomedicine. As Fig. 1.22 illustrates, a basic researcher or PhD student in biomedical informatics will typically be motivated by one of the application areas, such as those shown at the bottom of Fig. 1.21, but a dissertation worthy of a PhD in the field will generally be identified by a generalizable scientific result that also contributes to one of the component disciplines (Fig. 1.22) and on which scientists can build in the future.
1.4.5 Relationship to Biomedical Engineering
If BMI is a relatively young discipline, biomedical engineering is an older and better established one. Many engineering and medical schools have formal academic programs in the latter subject, often with departmental status and full-time faculty. Only in the past two decades or so has this begun to be true of academic biomedical informatics units. How does biomedical informatics relate to biomedical engineering, especially in an era when engineering and computer science are increasingly intertwined? Biomedical engineering departments emerged some 40 years ago, when technology began to play an increasingly prominent role in medical practice. The emphasis in such departments has tended to be on research and development of instrumentation (for example, as discussed in Chaps. 19 and 20, advanced monitoring systems, specialized transducers for clinical or laboratory use, and image-enhancement techniques for use in radiology), with an orientation toward the development of medical devices, prosthetics, and specialized research tools. There has also been a major emphasis on tissue engineering and on wet-bench research efforts.
In recent years, computing techniques have been used both in the design and construction of medical devices and in the medical devices themselves. For example, "smart" devices in most medical specialties increasingly rely on computer technology. Intensive care monitors that generate blood pressure records while calculating average values and hourly summaries are examples of such "smart" devices.
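The hourly-summary computation such a monitor performs is simple to sketch. In the example below, readings are assumed to arrive as (minutes-since-start, mean arterial pressure) pairs; the values are invented for illustration.

```python
# Sketch of an hourly blood-pressure summary like the one a "smart"
# ICU monitor might compute. Readings: (minutes since start, MAP in mmHg).
from statistics import mean

readings = [(0, 92), (15, 95), (30, 90), (45, 97),
            (60, 88), (75, 85), (90, 91), (105, 86)]

# Group the readings into hour-long buckets, then report each hour's mean
by_hour: dict = {}
for minute, pressure in readings:
    by_hour.setdefault(minute // 60, []).append(pressure)

for hour in sorted(by_hour):
    values = by_hour[hour]
    print(f"hour {hour}: mean MAP = {mean(values):.1f} mmHg "
          f"({len(values)} readings)")
```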
The overlap between biomedical engineering and BMI suggests that it would be unwise to draw compulsively strict boundaries between the two camps. There are ample opportunities for interaction, and there are chapters in this book that clearly overlap with biomedical engineering topics (e.g., Chap. 19 on patient-monitoring systems and Chap. 20 on radiology systems). Even where they meet, however, the fields have differences in emphasis that can help you to understand their different evolutionary histories. In biomedical engineering, the emphasis is on medical devices; in BMI, the emphasis is on biomedical information and knowledge and on their management with the use of computers. In both fields, the computer is secondary, although both use computer technology. The emphasis in this book is on the informatics end of the spectrum, so we do not spend much time examining biomedical engineering topics.

1.5 The Nature of Medical Information
From the above discussion, one might conclude that biomedical applications do not raise any unique problems or concerns. On the contrary, the biomedical environment raises several issues that, interestingly, are quite distinct from those encountered in most other domains of applied computing. Clinical information seems to be systematically different from the information used in physics, engineering, or even clinical chemistry (which resembles general chemical applications more than it does medical ones). Aspects of biomedical information include an essence of uncertainty (we can never know everything about a physiological process), and this gives rise to inevitable variability among individuals. These differences raise special problems, and some researchers suggest that biomedical informatics differs from conventional computer science in a fundamental way. We will explore these differences only briefly here; for details, you can consult Blois's book on this topic (see Suggested Readings).
Let us examine an example of what we will call a low-level (or easily formalized) science. Physics is a natural starting point; in any discussion of the hierarchical relationships among the sciences (from the fourth-century-BC Greek philosopher Aristotle to the twentieth-century American librarian Melvil Dewey), physics will be placed near the bottom. Physics characteristically has a certain kind of simplicity, or generality. The concepts and descriptions of the objects and processes of physics, however, are necessarily used in all applied fields, including medicine. The laws of physics and the descriptions of certain kinds of physical processes are essential in representing or explaining functions that we regard as medical in nature. We need to know something about molecular physics, for example, to understand why water is such a good solvent; to explain how nutrient molecules are metabolized, we talk about the role of electron-transfer reactions. Applying a computer (or any formal computation) to a physical problem in a medical context is no different from doing so in a physics laboratory or for an engineering application. The use of computers in various low-level processes (such as those in physics or chemistry) is similar and is application-independent. If we are talking about the solvent properties of water, it makes no difference whether we are working in geology, engineering, or medicine. Such low-level physical processes are particularly receptive to mathematical treatment, so using computers for these applications requires only conventional numerical programming.
In biomedicine, however, there are other, higher-level processes carried out in more complex objects, such as organisms (one type of which is the patient). Many of the important informational processes are of this kind. When we discuss, describe, or record the properties or behavior of human beings, we are using very high-level descriptions of objects whose behavior has no counterpart in physics or engineering. The person who uses computers to analyze the descriptions of these high-level objects and processes encounters serious difficulties (Blois 1984).
One might object to this line of argument by pointing out that,
after all, computers are routinely used in commercial
applications in which human beings and the situations that
concern them are involved and the relevant calculations are
carried out successfully. The explanation is that, in these commercial applications, the descriptions of human beings and their activities have been so highly abstracted that the events or processes involved have been reduced to low-level objects. In biomedicine,
abstractions taken to this degree would be useless from a
clinical or research perspective.
For example, one instance of a human being in the banking
business is the customer, who can deposit, borrow, withdraw or
invest money. To describe business activities like these, we
need only a few properties; the customer can still be an abstract
entity. In clinical medicine, however, we could not begin to treat
a patient represented with such few abstractions. We must be
prepared to analyze the most complex behaviors that human
beings display and to describe patients as completely as
possible. We must deal with the rich descriptions that occur at
the high levels of the hierarchy,
and we may have difficulty encoding and processing this
information using the tools of mathematics and computer
science that work so well
at the low levels. In light of these comments, the general enterprise known as artificial intelligence (AI) can aptly be described as the application of computer science to high-level, real-world problems.
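The banking contrast above is easy to make concrete in code. In the sketch below, the class definitions and field names are invented for illustration: the bank can reduce its customer to a few crisp, low-level attributes, while a clinically useful patient description must carry soft, high-level ones, including diagnoses that are probabilities rather than truth values.

```python
# A toy contrast between a low-level commercial abstraction of a person
# and the richer, high-level description that clinical work requires.
# All class and field names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class BankCustomer:
    customer_id: str
    balance_cents: int  # exact, unambiguous, fully formalized

@dataclass
class Patient:
    patient_id: str
    problems: list = field(default_factory=list)       # e.g., "chest pain"
    exam_findings: list = field(default_factory=list)  # soft and qualitative
    diagnoses: list = field(default_factory=list)      # (label, probability)

alice = BankCustomer("c042", balance_cents=125_000)
bob = Patient(
    "p017",
    problems=["intermittent chest pain"],
    exam_findings=["appears fatigued", "faint systolic murmur"],
    diagnoses=[("stable angina", 0.6), ("GERD", 0.25)],  # never certainty
)
print(alice)
print(bob)
```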
Biomedical informatics therefore includes computer applications that range from processing very low-level descriptions, which differ little from their counterparts in physics, chemistry, or engineering, to processing extremely complex high-level ones, which are completely and systematically different. When we study human beings in their entirety (including such aspects as human cognition, self-awareness, intentionality, and behavior), we must use these high-level descriptions. We will find that they raise complex issues to which conventional logic and mathematics are less readily applicable. In general, the attributes of low-level objects appear sharp, crisp, and unambiguous (e.g., "length," "mass"), whereas high-level ones tend to be soft, fuzzy, and imprecise (e.g., "unpleasant odor," "good"). Just as we must develop different methods to describe high-level objects, the inference methods that we use with such objects may differ from those that we use with low-level ones.
In formal logic, we start with the assumption that a given
proposition must be either true or false. This feature is essential
because logic is concerned with the preservation of truth value
under various formal transformations. However, it is difficult or
impossible to assume that all propositions have truth values
when we are dealing with the many high-level descriptions in
medicine or, indeed, in everyday situations. Questions such as "Was Woodrow Wilson a good president?" cannot be answered with a simple "yes" or "no" (unless we limit the question to specific criteria for determining the goodness of presidents). Many common questions in biomedicine have the same property.

1.6 Integrating Biomedical Informatics and Clinical Practice
It should be clear from the above discussion that biomedical
informatics is a remarkably broad and complex topic. We have
argued that information management is intrinsic to clinical
practice and that interest in using computers to aid in
information management has grown over the past five decades.
In this chapter and throughout the book, we emphasize the
myriad ways in which computers are used in biomedicine to
ease the burdens of information processing and the means by
which new technology promises to change the delivery of health
care. The extent to which such changes are realized, and their
rate of occurrence, will be determined in part by external forces
that influence the costs of developing and implementing
biomedical applications and the ability of scientists, physicians,
patients, and the health care system to accrue the potential
benefits.
We can summarize several global forces that
are affecting biomedical computing and will determine the
extent to which computers are assimilated into clinical practice:
(1) new developments in computer hardware and software; (2) a
gradual increase in the number of people who have received training both in medicine or another health profession and in BMI;
and (3) continuing changes in health care financing
designed to control the rate of growth of health-related
expenditures. We touched on the first of these factors in Sect. 1.4.2, where we described the historical development of biomedical computing and the trend from mainframe computers, to microcomputers and PCs, and to today's mobile devices.
The vision of the future outlined in Sect. 1.1 similarly builds on
the influence the Internet has had on society as a whole
over the past decade. New hardware technologies have made
powerful computers inexpensive and available to hospitals,
departments within hospitals, and even individual physicians. The wide selection of computers of all sizes, prices, and capabilities makes computer applications both attractive and accessible.
Technological advances in information storage devices, including the movement of files to the "cloud," are facilitating the economical storage of large amounts of data, thereby improving the viability of data-intensive applications, such as the all-digital radiology department discussed in Chap. 20. Hardware standardization and advances in network
technology are making it easier to share data and integrate
related information management functions within a hospital or
other health care organization.
Computers are becoming ever more prevalent in all aspects of our lives, whether as an automated teller machine, as the microprocessor in a microwave oven, or as a telephone that takes pictures and shares them wirelessly with others. Physicians trained in recent years may have used computer programs to learn diagnostic techniques or to manage the therapy of simulated patients. They may have learned to use a computer to search the medical literature, either directly or with the help of a specially trained librarian. Simple exposure to computers does not, however, guarantee an eagerness to embrace the machine. Clinical personnel will remain unwilling to use computer systems that are poorly designed, confusing, time-consuming, or lacking in clear benefit (see Chaps. 4 and 6). As they become more sophisticated in the use of computers in other aspects of their lives, their expectations of clinical software will become only more demanding.
The second factor is the increase in the number of professionals who are being trained to understand the biomedical issues as well as the technical and engineering ones. Computer scientists who understand biomedicine are better able to design systems that respond to real needs and that are sensitive to workflow and clinical culture. Health professionals who receive formal training in BMI are likely to build systems using well-established techniques while avoiding the past mistakes of other developers. As more professionals are trained in the special aspects of both fields, and as the programs they develop are introduced, health care professionals are more likely to have useful and usable systems available when they turn to the computer for help with information management tasks.
The third factor affecting the integration of information technology into health care settings is managed care and the increasing pressure to control medical spending. The growing tendency to apply technology to all patient-care tasks is a frequently cited phenomenon of modern medical practice. Physical findings alone are no longer considered adequate for making diagnoses and planning treatments. In fact, medical students who are taught by more experienced physicians to find subtle diagnostic signs by examining various parts of the body often choose instead to bypass or deemphasize physical examinations in favor of ordering one test after another. Sometimes they do so without paying sufficient attention to the resulting cost. Some new technologies replace less expensive, but technologically inferior, tests. In such cases, use of the more expensive approach is generally justified. Sometimes, computer-related technologies have allowed us to perform tasks that previously were not possible. For example, the scans produced by computed tomography or magnetic resonance imaging (see Chaps. 9 and 20) have allowed physicians to view cross-sectional slices of the body for the first time, and medical instruments in intensive care units continuously monitor body functions of patients that previously could be checked only episodically (see Chap. 19). However, the development of expensive new technologies, and the belief that more and better technology was needed, helped drive the rapid rise in health care costs of the 1970s and 1980s, leading to the introduction of managed care and capitation, changes in financing and delivery that were designed to curb spending in the new era of cost consciousness. Integrated computer systems potentially provide the means to capture data for detailed cost accounting, to analyze the relationship of the costs of care to the benefits of that care, to evaluate the quality of the care provided, and to identify areas of inefficiency. Systems that improve the quality of care while reducing its cost will clearly be welcomed. The effect of cost-containment pressures on technologies that increase the cost of care while improving its quality is less clear. Medical technologies, including computers, will be embraced only if they improve the delivery of clinical care while either reducing costs or providing benefits that clearly exceed their costs.
Improvements in hardware and software are making computers better suited for biomedical applications. Designers of medical systems must, however, successfully address many logistical and engineering questions before computers can be fully integrated into medical practice. For example: Are the computers conveniently located? Should mobile devices replace the tethered workstations of the past? Can users complete their tasks without excessive delays? Is the system reliable enough to prevent loss of data? Can users interact easily and intuitively with the computer? Are patient data secure and properly protected from prying eyes? In addition, cost-control pressures produce a growing reluctance to embrace expensive technologies that add to the already high cost of medical care. The net effect of these opposing trends will in large part determine the extent to which computers continue to be integrated into the health care environment.
In summary, rapid advances in computing hardware and
software, coupled with increasing computer literacy among
healthcare professionals and researchers, are facilitating the
implementation of effective computing applications in clinical
practice, public health, and life sciences research. Furthermore,
in the increasingly competitive health care industry, providers have a heightened need for the information management capabilities that computer systems provide. The challenge is to demonstrate, persuasively and rigorously, the financial and clinical advantages of these systems (see Chap. 11).
