

Data visualization and cognitive biases in audits

Chengyee Janie Chang and Yan Luo
Charles W. Lamden School of Accountancy, San Diego State University, San Diego, California, USA

Received 31 August 2017; Revised 24 May 2018, 5 October 2018 and 1 December 2018; Accepted 31 December 2018

Managerial Auditing Journal, Vol. 36 No. 1, 2021, pp. 1-16
© Emerald Publishing Limited, 0268-6902
DOI 10.1108/MAJ-08-2017-1637

Abstract
Purpose – This paper aims to examine major cognitive biases in auditors' analyses involving visualization and to propose practical approaches to address such biases in data visualization.
Design/methodology/approach – Using the professional judgment framework of KPMG (2011), this
study performs an analysis of whether and how five major types of cognitive biases (framing, availability,
overconfidence, anchoring and confirmation) may occur in an auditor’s data visualization and how such
biases potentially compromise audit quality.
Findings – The analysis suggests that data visualization can trigger and/or aggravate the common
cognitive biases in audit. If not properly addressed, such biases may adversely affect auditors' judgment and
decision-making.
Practical implications – To ensure that data visualization improves audit efficiency and effectiveness, it
is essential that auditors are aware of and successfully address cognitive biases in data visualization. Six
practical approaches to debias cognitive biases in auditors’ visualization are proposed: using data
visualization to complement rather than supplement traditional audit evidence; positioning data visualization
to support rather than replace sophisticated analytics tools; using a dashboard with multiple dimensions;
using both visualized and tabular data in analyses; assigning experienced audit staff; and providing pre-audit
tutorials on cognitive bias and visualization.
Originality/value – The study raises awareness of psychological issues in an audit setting.
Keywords Cognitive bias, Audit
Paper type Research paper

1. Introduction
In the age of Big Data, stakeholders in capital markets (e.g. managers, investors, auditors,
government agencies, etc.) are increasingly overloaded with electronic information.
Auditing standards require auditors to obtain an understanding of an audit entity and its
environment to assess the risk of material misstatements in an audit engagement (AU
Section 314). The sheer amount of information generated by audited entities’ ever-increasing
and diverse use of information technologies challenges auditors’ ability to make effective
and efficient audit decisions. Technology-enabled data visualization has great potential to
improve audit efficiency and effectiveness by transforming large complex datasets into
high-level compact graphic representations of the data that can facilitate both the discovery
and communication of valuable and latent patterns. However, these benefits can only be
achieved if auditors are capable of mitigating the potential cognitive biases in data
visualization and are willing to subject the insights gained from visualization to more
sophisticated statistical analysis (Sloman, 1996; Lurie and Mason, 2007; Cao et al., 2015;
Hirsch et al., 2015; Paddrick et al., 2016).
It has been demonstrated that the human brain can process more information at a faster rate when the information is presented graphically than when it is presented in text or tabular formats (Lurie and Mason, 2007). Graphic displays allow auditors, especially novice auditors, to quickly comprehend the data, detect certain relevant patterns and identify trends and relationships that would take much more effort to identify if presented in tables (Vessey and Galletta, 1991; Singh and Best, 2016). Thus, visualization can be used in every phase of an audit, including decisions to accept or continue an audit engagement, audit planning, risk assessment, responding to risk assessments and tests of controls or substantive testing (AICPA, 2015).
Currently, visualization is being used by some auditors to generate insights, increase the
accuracy of conclusions and improve the efficiency and effectiveness of the auditing process
(Singh et al., 2013; AICPA, 2015; Brown-Liburd et al., 2015; Cao et al., 2015; Singh and Best,
2016)[1]. Data visualization can be used to both explore and explain data (Alawadhi, 2015).
In exploratory data visualization, visualization tools (e.g. scatterplots, trend lines, bubble
charts, tables, etc.) are used before and during the process of gathering and evaluating audit
evidence to explore data relationships from various perspectives, discover new and
meaningful patterns and detect discontinuities, exceptions and outliers that might be
concealed in the operational and financial data of an audit client (Lurie and Mason, 2007;
Brown-Liburd et al., 2015; Paddrick et al., 2016). Exploratory data visualization can also be
used to analyze a whole population of transactions rather than a sample, which may make
some audit procedures more effective, as patterns, trends or anomalies in a population are
not always detectable when the procedures are applied to samples (Alles and Gray, 2014;
Brown-Liburd et al., 2015). Although more sophisticated statistical tools should be applied to
the data before any conclusions are drawn, preliminary exploratory visualization is useful
when the datasets are very large or if the specific nature and relationships of the data are not
clearly predefined. In contrast, explanatory data visualization usually occurs after auditors
have analyzed the data using more established, sophisticated tools (e.g. regression analysis).
It is commonly used to synthesize and communicate auditors’ main findings to convince the
viewers of the auditors’ conclusions and facilitate decision-making (Arunachalam et al.,
2002; Eppler and Aeschimann, 2009; Fisher, 2010; Alawadhi, 2015; Appelbaum et al., 2017).
In conclusion, exploratory data visualization techniques are used to analyze the data,
whereas explanatory data visualization techniques are used to communicate the results of
the analyses.
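To make the exploratory use of visualization described above more concrete, the following Python sketch (a hypothetical illustration using pandas and matplotlib; the file name transactions.csv and its column names are assumptions, not taken from the paper) plots an entire population of transactions rather than a sample and highlights amounts that fall far from the rest. Any points flagged this way would still need to be examined with more rigorous statistical procedures before any audit conclusion is drawn.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical transaction-level extract: one row per recorded sale.
# The file name and column names ("posting_date", "amount") are assumptions for this sketch.
df = pd.read_csv("transactions.csv", parse_dates=["posting_date"])

# Simple screen over the WHOLE population (not a sample): flag transactions
# more than three standard deviations from the mean amount.
mean, std = df["amount"].mean(), df["amount"].std()
df["flagged"] = (df["amount"] - mean).abs() > 3 * std

normal = df[~df["flagged"]]
outliers = df[df["flagged"]]

# Scatterplot of the full population; potential outliers drawn in a contrasting color.
fig, ax = plt.subplots(figsize=(9, 4))
ax.scatter(normal["posting_date"], normal["amount"], s=8, alpha=0.4, label="within 3 s.d.")
ax.scatter(outliers["posting_date"], outliers["amount"], s=25, color="red", label="potential outlier")
ax.set_xlabel("Posting date")
ax.set_ylabel("Transaction amount")
ax.set_title("Exploratory view of the full transaction population")
ax.legend()
plt.tight_layout()
plt.show()

# The flagged rows are a starting point for follow-up procedures, not audit evidence by themselves.
print(outliers.sort_values("amount", ascending=False).head(10))
```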
Prior research investigating the effects of visualization on decision-making in the context
of auditing suggests that data visualization enhances auditor judgment and audit quality
and can benefit audit procedures, such as accepting or continuing an audit engagement, risk
assessment and analytical procedures (Blocher et al., 1986; Kogan et al., 2014; Hirsch et al.,
2015; Cao et al., 2015). However, to the best of our knowledge, prior studies have not
examined the potential negative effects of visualization on decision-making in auditing. In
particular, it is unclear whether visualization’s effects on cognitive biases impact the
efficiency and effectiveness of audit procedures. This study explores whether and how
cognitive bias is triggered or aggravated by data visualization and how such biases
potentially compromise audit quality.
Cognitive bias refers to the tendency of individuals to make systematic mistakes in
judgment when making decisions (Kahneman and Tversky, 1972), and it has been identified
and studied in behavioral audit research (Trotman et al., 2011). To ensure that audit
decisions are based on relevant and trustworthy evidence, it is critical to understand how
any cognitive biases triggered or aggravated by explanatory and exploratory data
visualization might affect the process of gathering and evaluating audit evidence and thus
potentially compromise audit quality.
This study examines whether and how data visualization can trigger and/or aggravate
the five types of cognitive bias that are most relevant in auditing contexts: framing,
availability, overconfidence, anchoring and confirmation biases, specifically identified in the KPMG (2011) professional judgment framework[2]. Prior research (Joyce and Biddle, 1981; Lurie and Mason, 2007) shows that these biases can be operationalized through the manipulation of reference points, the format of the graphic presentation, or the vividness and evaluability of the data. Based on the insights from audit practitioners and regulators as well as from extant psychology and behavior research (Benbasat and Dexter, 1986; Hammersley, 2006; Lurie and Mason, 2007; KPMG, 2011; Singh and Best, 2016; Rose et al., 2017), six approaches to debiasing auditors' visualization analyses are identified: using data
visualization to complement rather than supplement traditional audit evidence; positioning
data visualization to support rather than replace sophisticated analytics tools; using a
dashboard with multiple dimensions; using both visualized and tabular data in analyses;
assigning experienced audit staff; and providing pre-audit tutorials on cognitive bias and
visualization.
This study has important implications for audit practitioners, standard setters, educators
and researchers. First, it identifies key cognitive biases that potentially constrain the
effectiveness of auditors’ use of data visualization in the audit process and identifies six
approaches that may mitigate these biases. Second, it suggests that audit standard setters
should consider adjusting auditing standards to clarify the role of data visualization and
should provide specific guidelines on how to use data visualization tools to perform audits
and on how to interpret and document audit evidence collected from visualization. Third, it
highlights the need to include visualization and potential cognitive biases in the auditing
curriculum. Last but not least, this study is a response to the call for research on both the
positive and negative effects of Big Data analytics (including visualization) on auditor
judgment and to investigate solutions/approaches that might mitigate any negative impact
on auditor judgment (Brown-Liburd et al., 2015).
The remainder of this paper is organized as follows. Section 2 defines the five common
cognitive biases in an audit setting, discusses how visualization may trigger or aggravate
specific biases and demonstrates how such biases impede the effectiveness and efficiency of
audit procedures, decision-making and judgment. Section 3 proposes techniques for
mitigating the cognitive biases that may occur when auditors use visualization methods.
The final section considers this study’s academic, regulatory and practical implications and
the challenges in applying visualization methods to audits.

2. Data visualization and cognitive bias


“A picture is worth a thousand rows of data” (Lurie and Mason, 2007, p. 160). Prior research
suggests that visual transformations of data affect the insights derived from the data and
impact both the decision processes and outcomes (Bettman and Kakkar, 1977; Lurie and
Mason, 2007). For example, a heat map can help users to recognize patterns and identify
outliers. Both bar-charts and pie charts can not only make it easier to observe data
distributions but also make it more difficult to make inferences about trends (Kobsa, 2001;
Lurie and Mason, 2007; Jääskeläinen and Roitto, 2016). It seems that data visualization tools
select, transform and present data in visual formats to facilitate the exploration and
understanding of the data and convert data into insights (Green, 1998; Lurie and Mason,
2007); they have the potential to help auditors to make better, faster and more confident
decisions and to improve audit efficiency and effectiveness. However, studies have shown
that graph preparers can strategically manipulate seemingly minor formatting options (e.g.
including or omitting horizontal grid lines, presenting the data in a certain chronological
order, omitting negative values) to affect the graph users’ impressions of the data, which
may in turn affect their predictions of the future value of time-series data or their assessment
of trends (Lawrence and O'Connor, 1993; Arunachalam et al., 2002). Such improperly designed graphs are commonly used by audit clients, even in their annual reports (Johnson et al., 1980; Beattie and Jones, 1992), potentially triggering cognitive bias. Such graphs are widely recognized as having a persistent and typically adverse impact on auditors' decision processes (Knapp and Knapp, 2012). Auditors' capacity to maximize the benefits of data visualization and improve the efficiency and effectiveness of the audit process depends on how well they recognize and mitigate potential cognitive biases. The following sections
review how visualizations can trigger or aggravate five common cognitive biases: the
framing effect; availability bias; overconfidence bias; anchoring bias; and confirmation bias.

2.1 Data visualization and the framing effect


Frames are “mental structures that decision makers use, usually subconsciously, to
simplify, organize and guide their understanding of a situation; these frames shape their
perspectives and determine what information they see as relevant or irrelevant, important or
unimportant” (KPMG, 2011, p. 16). As with optical illusions, individuals’ responses to a
problem vary according to the way the problem is framed (Kahneman and Tversky, 1979;
Fagley, 1993; Chang et al., 2002), a bias known as the framing effect. Previous studies have
shown that the framing effect can affect audit procedures, such as searching or evaluating
evidence, and auditors’ judgments (Libby, 1985; Ayers and Kaplan, 1993; Asare and Wright,
2003; KPMG, 2011).
In their study of the framing effects of data visualizations in political messaging and
decision-making, Hullman and Diakopoulos (2011, p. 2231) suggest that visualization
techniques that “prioritize particular interpretations [. . .] that ‘tell a story’ can significantly
affect end-user interpretation.” In an audit setting, O’Clock and Devine (1995) document the
differential effects of positive and negative framing of information on auditors’ evidence
collection and evaluation, and on their assessment of the client's going-concern status. For
example, a graphic representation (e.g. a pie chart) highlighting that 30 per cent (70 per cent)
of a client’s usual trade credits from suppliers are denied (awarded) might impact auditors’
assessment of the probability and severity of their audit client’s financial difficulties, which
is one of the key conditions of an entity’s ability to continue as a going concern (AU 341.06
An Entity’s Ability to Continue as Going Concern, Consideration of Conditions and Events).
Due to the framing effect, auditors who receive or process the negatively framed information
(credit denial) are more likely to have substantial doubt about the client’s ability to continue
as a going concern. Therefore, improperly designed visualizations can trigger and/or
aggravate framing effects during an audit, and clients can use visualization tools
strategically to convey an intended story at an early stage of an audit, which may impact
subsequent information acquisition, evaluation and interpretation, and even lead to different
audit decisions. Auditors are subject to framing effects not only when they use visualization
tools to explore data but also when they read client-created, improperly designed or
impression-inducing graphs (Arunachalam et al., 2002).
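To illustrate the trade-credit example above, the short sketch below (a made-up illustration, not taken from the paper) renders the same underlying fact, that 30 per cent of usual trade credits were denied and 70 per cent awarded, as two pie charts: one framed around denials and one framed around awards. The data are identical; only the framing and emphasis differ.

```python
import matplotlib.pyplot as plt

# Identical underlying fact: 30% of usual trade credits denied, 70% awarded.
denied, awarded = 30, 70

fig, (ax_neg, ax_pos) = plt.subplots(1, 2, figsize=(8, 4))

# Negative framing: the denied share is emphasized (pulled out, strong color).
ax_neg.pie([denied, awarded], labels=["Denied", "Awarded"],
           colors=["firebrick", "lightgray"], explode=[0.1, 0],
           autopct="%d%%", startangle=90)
ax_neg.set_title("30% of trade credits denied")

# Positive framing: the awarded share is emphasized instead.
ax_pos.pie([awarded, denied], labels=["Awarded", "Denied"],
           colors=["seagreen", "lightgray"], explode=[0.1, 0],
           autopct="%d%%", startangle=90)
ax_pos.set_title("70% of trade credits awarded")

plt.tight_layout()
plt.show()
```

An auditor shown only the left-hand chart may weigh the client's financial difficulties more heavily than one shown the right-hand chart, even though both charts encode the same proportion.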

2.2 Data visualization and availability bias


Availability bias refers to the tendency to use and trust information that is easily accessible
and to consider such information more relevant and more important than evidence that is
more difficult to obtain (Tversky and Kahneman, 1973). In audit settings, availability bias
can cause auditors to focus on readily available alternatives or information (COSO, 2012),
such as information presented visually, which may then unduly influence estimates,
probability assessments and other professional judgments.
Shanteau (1989) suggests that availability bias is influenced by imaginability, familiarity, vividness and evaluability. Data visualization directly improves at least two of these features, vividness and evaluability (Lurie and Mason, 2007), and thus might trigger or aggravate availability bias and cause auditors to ignore other (relevant) information that is not presented visually (Glazer et al., 1992; Mandel and Johnson, 2002). Vividness refers to the salience or availability of specific information (Nisbett and Ross, 1980). More vivid visual information is likely to be processed before less vivid visual information (Jarvenpaa, 1990). Visualization tools are likely to affect vividness simply by presenting data in a form that
uses pre-attentive graphic features, such as line orientation, width, length and color, that can
be processed with little effort (Julesz, 1981; Treisman, 1985; Healey et al., 1995). Thus, in
general, visualized information is likely to receive a greater weight than information
presented in text format (e.g. numbers) (Stone et al., 1997). Moreover, certain types of visual
representations are likely to be more vivid than others. In particular, shapes and colors that
“pop” because they are unique, contrast sharply with other data, or have the greatest
variation in size, have greater salience to human information processors (Benbasat and
Dexter, 1985; Simkin and Hastie, 1987; Jarvenpaa, 1990; Lurie and Mason, 2007). As a result,
more vivid information is likely to be more heavily weighted in auditors’ decision-making.
In addition, visualization improves the evaluability of the data (Lurie and Mason, 2007),
which may also trigger the availability bias in audits. Evaluability refers to the ease with
which information can be accessed and compared (Hsee, 1996). By making it easier to
compare information, visualization tools enable decision makers (e.g. auditors) to notice
changes, recognize outliers, detect trends and see patterns more quickly (Lurie and Mason,
2007). Making information easier to compare is likely to lead to increased acquisition,
weighting and processing of this information (Bettman and Kakkar, 1977; MacGregor and
Slovic, 1986; Hsee, 1996; Jarvenpaa, 1989, 1990). As a result, more evaluable information is
likely to be more heavily weighted in an auditor’s decision-making.
The availability bias in auditor information acquisition and processing may ultimately
compromise audit quality. Auditors operating under availability bias are less likely to invest
appropriate time and effort into considering different alternatives, to properly weight the
alternatives in terms of how well they meet the objective, or to consider the reliability,
validity, certainty and accuracy of information (COSO, 2012). More importantly, availability
bias could “dampen the professional skepticism that auditors should invoke during every
audit engagement and can render them less likely to uncover a fraudulent scheme
perpetrated by a client” (Knapp and Knapp, 2012, p. 41).

2.3 Data visualization and overconfidence bias


Overconfidence bias refers to an auditor’s tendency to “overestimate their own ability to
perform tasks or to make accurate diagnoses or other judgments and decisions” (KPMG,
2011, p. 25). Overconfidence bias can lead to suboptimal behavior at every step in the
judgment process and can compromise auditor objectivity. It can be disastrous when a false
sense of security leads an auditor to underinvest in defining the problem or identifying
fundamental objectives, abbreviate information gathering, consider too few alternatives or
truncate/skip information (COSO, 2012; Tang et al., 2013; Fay and Montague, 2014).
In general, decision makers tend to have more confidence in information presented in a
graphic format than in a numeric format (Amer, 1991). In an experimental study in a
financial decision-making context, Tang et al. (2013) find that data visualization increases
decision makers’ confidence without a corresponding increase in decision accuracy; thus,
visualizations can potentially aggravate decision makers’ overconfidence bias. In audit
settings, this can lead to problems in audit procedures such as failure to acquire adequate
audit evidence (e.g. failing to consider contradictory evidence) and failure to sufficiently review
subordinates' work (Kennedy and Peecher, 1997; KPMG, 2011; CAQ, 2014).

2.4 Data visualization and anchoring bias


In their examination of human information processing, Tversky and Kahneman (1974)
report that anchoring is an important cognitive bias. “Anchoring describes the phenomenon
that a given stimulus affects later judgments in the direction of the previous judgment, even
if both stimuli are completely unrelated” (Valdez et al., 2018, pp. 585-586). Behavioral
accounting research (Joyce and Biddle, 1981; Kinney and Uecker, 1982; Northcraft and Neale,
1987; Knapp and Knapp, 2012) has demonstrated that decision makers tend to fixate on their
initial estimate or expectation instead of sufficiently adjusting away from their initial anchor
as they progress through the evidence-collection process. In auditing, the anchoring effect
often occurs when an auditor places too much reliance on one piece of information or set of
circumstances (CAQ, 2014). According to Knapp and Knapp (2012), auditors are prone to the
anchoring bias, particularly when auditing accounting estimates such as valuations for
accounts receivable, inventory and loan portfolios. In such cases, auditors tend to anchor on
a company’s preaudit estimates for those accounts (such as allowance for inventory
obsolescence, allowance for doubtful accounts or likelihood and magnitude of contingent
liability) and easily accept the preaudit account balances as reasonable. As a result,
management’s initial estimates or preaudit numbers become powerful anchors that
unknowingly influence auditors’ judgment and decisions (COSO, 2012; CAQ, 2014).
More importantly, anchoring bias triggered and/or aggravated by data visualization
might affect the subsequent evaluations of the insights gained from the data (Valdez et al.,
2018). Using scatterplots, Valdez et al. (2018) examine the effects of anchoring and find that
the participants’ judgments on whether two clusters in a plot are separable are affected by:
(1) the distance between the clusters; and
(2) the distance between previously seen clusters.

Hullman and Diakopoulos (2011) point out that data visualization can anchor the users’
interpretations, which are most likely to be formed according to the dimensions of the data
used in the visualization analysis (e.g. the default view in a data visualization tool)[3]. This
reduces decision makers’ (e.g. auditors’) attention to other aspects of the data and the
likelihood that auditors will challenge the reasonableness of management’s estimations or
interpretations, especially when they are re-confirmed by other charts that appear when
users simply click on “view more charts” in the visualization tools (e.g. Tableau or Power BI)
without changing the dimensions being analyzed. Thus, visualizations can lead auditors to
overly rely on preliminary analyses, resulting in suboptimal audit procedures and final
judgments that are close to the client’s preferred initial direction/trend (KPMG, 2011; CAQ,
2014).

2.5 Data visualization and confirmation bias


Confirmation bias refers to “the tendency to seek and overweight confirming information in
the information gathering and evaluation steps, and to favor conclusions that are consistent
with initial beliefs or preferences. The confirmation tendency can bias a wide variety of
auditor judgments, ranging from an auditor only seeking evidence that is consistent with
client’s explanation for an unusual pattern in financial data, to placing disproportionate
weight on audit evidence that is consistent with a preferred outcome.” (Glover and Prawitt,
2013, p. 11).
Visual impressions are instantaneously processed in about 50 ms (Lindgaard et al., 2006). Data visualization allows vast amounts of data to be interpreted through intuitive, visual perceptions that do not require mental, numerical processing (Phillips et al., 2014). This results in a higher likelihood of confirmation bias, especially when the users focus on the subset of the data visualization that confirms their preconceived notions. This bias creates barriers to seeking and using disconfirming information (Phillips et al., 2014). Indeed, Aruna et al. (2010) show that when experiment participants are provided with information in the
form of a visualization, they exhibit more confirmation bias (i.e. discuss fewer hypotheses
and persist with poor hypotheses) than if the information is presented without
visualizations. Such visualization-based confirmation bias may lead auditors to
intentionally seek patterns that reaffirm preexisting explanations and/or ignore
contradictory patterns, resulting in less effective audit procedures and failure to challenge
the reasonableness of unusual patterns in audit clients’ data. For example, if auditors
initially consider a client’s explanation for an unusually low reserve for sales returns or
allowance for doubtful accounts as reasonable, they might selectively filter out evidence
demonstrating that the client’s reserve is much lower than the industry average and/or
disproportionately value the data that show the client’s estimate is relatively consistent with
some peer companies, even if the latter patterns only occur in the subset of the data, such as
certain product lines or business segments (Bendoly, 2016). As a result, data visualization
can trigger or even aggravate auditors’ confirmation bias and negatively impact the
effectiveness of audit procedures.

3. Mitigating cognitive bias in data visualization


Cognitive bias is a critical issue for auditors, and the interactions between cognitive bias and
data visualization have not been fully understood. Drawing upon the insights discussed
above, this study proposes six recommendations for mitigating cognitive bias in
visualization: using data visualization to complement rather than supplement traditional
audit evidence; positioning data visualization to support rather than replace sophisticated
analytics tools; using a dashboard with multiple dimensions; using both visualized and
tabular data in analyses; assigning experienced audit staff to design visualization
procedures; and providing pre-audit tutorials on cognitive bias in visualization.
First, auditors should use data visualization to complement the evidence collected from
traditional substantive audit procedures. Specifically, in addition to using data visualization
in exploring the data by discovering meaningful patterns and detecting discontinuities,
exceptions and outliers, data visualization could also be used again after the data have been
examined with more traditional auditing procedures (KPMG, 2012; Rose et al., 2017). In an
experimental study, Rose et al. (2017, p. 82) find that auditors who reexamine the analysis
through data visualizations after they have reviewed the results of preliminary analytical
procedures (i.e. traditional audit evidence) are more capable of differentiating relevant vs
irrelevant patterns, express more concerns about misstatement and are more skeptical of
patterns that contradict the evidence collected from traditional audit procedures.
The second proposal to mitigate cognitive bias in the use of visualization is built on the
idea that both explanatory data visualization and exploratory data visualization should play
a supporting role in analysis, and they should never replace sophisticated analytics/
statistical tools. Auditors should mainly use explanatory data visualization to communicate
the findings that are based on more established, sophisticated statistical analyses (e.g.
regression analysis, factor analysis) (Fisher, 2010; Alawadhi, 2015). Auditors should use
exploratory data visualization only to identify general relationships, patterns, trends,
anomalies and outliers; such findings should be discussed with experienced auditors and
investigated using sophisticated data analysis tools before any conclusions are drawn.
The third proposal to mitigate cognitive bias in the use of visualization is the adoption of
a dashboard that uses multiple visualized presentations to examine different dimensions of
the data set. When conducting an audit, it is critical to apply various views/dimensions to
the data set to reveal possible different stories (KPMG, 2015). One of the main advantages of
using visualizations in audit is to gain insights from data in various contexts through
graphical presentations (Singh and Best, 2016). Knapp and Knapp (2012, p. 45) point out that
“the quality of problem solving decisions is enhanced when decision makers are required to
identify multiple explanations for the source or cause of a given problem.” Using a design
science approach, Singh and Best (2016) demonstrate that the use of “dashboards” to create
a multi-view visualization of various indicators (often called dimensions in visualization
tools) related to the same key measurement (such as sales revenue) may enhance the
efficiency and effectiveness of auditors who are attempting to detect anomalous and
potentially fraudulent transactions in high volume accounting transactions. Such a
dashboard allows auditors to simultaneously analyze the data through various lenses. It
highlights alternative actions, generates evidence from different perspectives and even
makes opposing cases, all of which help auditors to overcome their cognitive biases (CAQ,
2014). The Appendix shows an example of a dashboard from AICPA (2015): a visual
representation of multiple different analyses of the variation in sales data under different
contexts including the location of convenience stores, whether the convenience store sells
gas, unit sales/square foot compared with the benchmark sales amount provided by the
National Association of Convenience Stores and the correlation between sales and
employees conditional upon whether the store sells gas. Dashboards presenting multiple
visualizations of the data set can challenge an auditor’s current cognitive frame and boost
their professional skepticism (KPMG, 2011). They can be particularly important in
exploratory data visualization because they present auditors with a range of approaches or
alternative explanations that can be tested using more rigorous statistical methods.
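As a rough sketch of how such a multi-view dashboard could be assembled outside a commercial tool, the Python code below lays out four views of the same key measurement (sales) across different dimensions, loosely following the convenience-store example above; the file store_sales.csv, its column names and the benchmark line are assumptions made purely for illustration.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical store-level data; file and column names are assumptions for this sketch.
stores = pd.read_csv("store_sales.csv")  # columns: region, sells_gas, sales, sq_feet, employees

fig, axes = plt.subplots(2, 2, figsize=(10, 7))

# View 1: the same key measurement (sales) summed by region.
stores.groupby("region")["sales"].sum().plot.bar(ax=axes[0, 0], title="Sales by region")

# View 2: sales split by whether the store sells gas.
stores.groupby("sells_gas")["sales"].sum().plot.bar(ax=axes[0, 1], title="Sales by gas / no gas")

# View 3: sales per square foot against a placeholder benchmark line.
per_sqft = stores["sales"] / stores["sq_feet"]
axes[1, 0].hist(per_sqft, bins=20)
axes[1, 0].axvline(per_sqft.median(), color="red", linestyle="--", label="benchmark (placeholder)")
axes[1, 0].set_title("Sales per square foot")
axes[1, 0].legend()

# View 4: sales against employees, split on the gas dimension.
for sells_gas, grp in stores.groupby("sells_gas"):
    axes[1, 1].scatter(grp["employees"], grp["sales"], alpha=0.6, label=f"sells_gas={sells_gas}")
axes[1, 1].set_title("Sales vs. employees")
axes[1, 1].set_xlabel("Employees")
axes[1, 1].legend()

fig.suptitle("One measurement (sales) viewed across several dimensions")
plt.tight_layout()
plt.show()
```

Presenting the four views side by side makes it harder to settle on the story told by any single chart, which is the debiasing point of the dashboard.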
The fourth proposal to mitigate cognitive bias in the use of visualization is to include
both visualized and tabular data in analyses, which might improve an auditor’s detection of
errors and fraud. Some visualization tools (such as Tableau or Power BI) allow users to
retrieve specific data values from the visualized graphs. Displays that combine both tabular
and visualized data lead to better decision-making and more efficient error detection than
either visualized information or tabular displays alone (Benbasat and Dexter, 1986; Lurie
and Mason, 2007). That is, although data visualizations are likely to be helpful for detecting
trends, comparing patterns and interpolating values, tabular representations are superior
for retrieving the specific data values used in making judgments (Benbasat and Dexter,
1985, 1986; Jarvenpaa and Dickson, 1988; Vessey and Galletta, 1991).
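A minimal sketch of such a combined display is shown below, assuming made-up monthly receivables balances purely for illustration: it pairs a line chart (good for spotting the trend and the unusual April balance) with the underlying table so that the exact values remain retrievable.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Made-up monthly accounts receivable balances, for illustration only.
data = pd.DataFrame({
    "month": ["Jan", "Feb", "Mar", "Apr", "May", "Jun"],
    "receivables": [1_240_000, 1_310_000, 1_295_000, 1_760_000, 1_322_000, 1_340_000],
})

fig, (ax_chart, ax_table) = plt.subplots(
    2, 1, figsize=(7, 6), gridspec_kw={"height_ratios": [3, 1]})

# Visual view: the trend and the unusual April balance stand out.
ax_chart.plot(data["month"], data["receivables"], marker="o")
ax_chart.set_title("Accounts receivable by month")
ax_chart.set_ylabel("Balance")

# Tabular view: keeps exact values available for judgments and workpapers.
ax_table.axis("off")
table = ax_table.table(cellText=[[f"{v:,.0f}" for v in data["receivables"]]],
                       colLabels=data["month"].tolist(),
                       rowLabels=["Balance"],
                       loc="center")
table.scale(1, 1.5)

plt.tight_layout()
plt.show()
```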
The fifth technique for mitigating auditors’ cognitive bias, assigning senior auditors to
design the data visualization strategy, draws upon behavior audit research on the
experience effect (Bonner and Walker, 1994; Libby et al., 2002). Senior auditors can design
the data visualization components of an audit program to better achieve the objectives of the
audit procedure by, for example, determining the desired attributes of the data used in the
data visualizations. Having both senior and junior auditors perform and review the data
visualization tasks may also help to mitigate cognitive bias in data visualization during an
audit engagement. Lurie and Mason (2007) suggest that the effect of visualizations on
decisions is likely to depend on users’ ability to recognize which factors are important, their
willingness to engage in more cognitive effort and/or their use of variant visualization
approaches to debias their information acquisition and processing and their decision-
making. Consistent with this observation, Dilla et al. (2010, p. 31) point out that "visualization that may be useful for experienced decision makers knowledgeable in accounting domain may result in inefficient or inaccurate decisions for less knowledgeable users." This proposition has been verified by a number of studies (Bettman and Kakkar, 1977; Lurie and Mason, 2007; Brown-Liburd et al., 2015). Experienced auditors are typically skilled at challenging the frames provided by clients and do not readily adopt the given frames in presenting financial (or non-financial) data. Experienced auditors apply this ability in situations where they need to help client management see an alternative viewpoint
on a critical accounting issue. Hence, it is critical to ensure that junior auditors work with
experienced auditors when generating and evaluating visualized data.
A sixth proposal for mitigating cognitive bias is to incorporate training on cognitive bias
into orientation programs of public accounting firms. Brief preaudit tutorials demonstrating
the impact of cognitive biases on decisions made on the basis of visualization tools may
effectively minimize their impact on auditors’ judgments. It is important to design preaudit
tutorials to make auditors more conscious of potential cognitive biases when they perform
both exploratory and explanatory visualization analyses in audits.

4. Discussion and conclusion


Data visualization potentially improves the efficiency and effectiveness of the processes
auditors use to gather and evaluate evidence; however, visualization can be a mixed
blessing. To capitalize on the capacity of data visualization to enhance auditors’ ability to
derive insights from data and to make better fact-based decisions (Bowtell et al., 2014),
auditors need to be fully aware of the potential cognitive biases inherent in the process. To
overcome the cognitive limitations potentially encountered when using data visualization to
acquire, process and evaluate evidence and to make decisions, auditors need to apply
sophisticated analytics tools and draw upon insights from traditional audit evidence (such
as analytical procedures and substantive audit procedures).
This study has important implications for audit practitioners, standard setters,
educators and researchers. First, this study suggests that the benefit of data
visualization in facilitating audits and/or enhancing audit efficiency depends on how
well the auditors manage or mitigate the potential cognitive biases. Second, this study
increases audit practitioners’ awareness of potential cognitive biases in their use of
data visualization tools for evidence gathering and evaluation and promotes the careful
selection and timing of appropriate visualization methods for various audit tasks. To
ensure that audit judgments and decisions are based on sufficient, relevant and reliable
information, it is extremely important for audit practitioners and audit firms to address
the cognitive biases that can arise from integrating data visualization into the audit
process. This study identifies six measures for dealing with potential cognitive biases.
Although it is unlikely that cognitive biases can be entirely eliminated, a better
understanding of their nature can help auditors to recognize situations in which their
judgment might be biased and to mitigate the potential negative effects of cognitive
biases on decision-making. The integrative implementation of the recommended
debiasing tools into the processes for collecting and interpreting audit evidence will
improve the effectiveness of technology-enabled data visualization in an audit.
For standard setters, the awareness that cognitive biases in data visualization might
compromise audit quality, even cause audit failure, indicates a need to adjust auditing
standards to clarify the importance of sophisticated analytics tools and traditional audit
evidence, and the need for specific guidelines for performing audits based on evidence
collected using data visualization tools. This is a critical issue, as due to the amount of data
generated by clients' ERP systems, large public accounting firms have required auditors to
use visualization tools. As discussed in Murphy and Tysiac (2015)[4], Martin Baumann, the
PCAOB’s chief auditor and director of professional standards, said in a video interview that
regulators need to ensure that auditing standards facilitate technological improvements in
auditing rather than serving as an obstacle to progress in this area. Appropriate guidance in
auditing standards would encourage and support the implementation of data visualization
in audit procedures in a way that improves risk assessment and evidence gathering and
evaluation.
The importance of incorporating training in visualization skillsets into the accounting
curriculum has been acknowledged by audit educators. It is critical that when teaching
visualization tools, audit educators immediately raise students’ awareness of potential
cognitive biases in visualization. This will help new graduates to avoid decision traps and to
maximize data visualization’s potential to improve the efficiency and effectiveness of audit
procedures, enhance the collection and interpretation of audit evidence and improve auditor
decision-making.
For researchers, this study responds to the call for research to examine both the positive
and negative effects of Big Data analytics (including visualization) on auditor judgment and
to investigate approaches that might mitigate any negative impacts on auditor judgment
(Brown-Liburd et al., 2015). This study suggests that the potential negative impacts of
visualization on audit decision-making are seriously under-recognized, and it proposes
several testable propositions on how visualization triggers or aggravates specific cognitive
biases (framing, availability, overconfidence, anchoring and confirmation) in an audit
setting. It also considers whether and how expertise (skills and/or experience) can moderate
such biases. Future research might consider designing experiments that test these
hypotheses in a laboratory setting.
Overall, maximizing the capacity of new technologies such as data visualization to use
Big Data to improve audit efficiency and effectiveness will take the long-term joint efforts of
audit firms, educators, standard setters, regulators, professional bodies and solution
providers.

Notes
1. Visualization is widely used in other areas of the business community including accounting
(Dilla et al., 2010; Hirsch et al., 2015), supply chain management (Bendoly, 2016), performance
measurement systems (Jääskeläinen and Roitto, 2016), strategic decision-making (Biloslavo et al.,
2012), financial investment (Tang et al., 2013) and marketing (Lurie and Mason, 2007).
2. Specifically, KPMG (2011, p. 23) suggest that their professional judgment framework covers five
common “tendencies that are most applicable and important for audit professionals [. . .] purpose
is to illustrate that the tendencies are common and that the related biases affect all of us.”
3. A dimension provides context/reference information about a business measurement, such as
revenue or cost of goods sold. Revenue by customer, revenue by division, or revenue by year
are examples of viewing the same key business measurement using different dimensions.
4. For example, Murphy and Tysiac (2015) mention that AICPA has established audit data standards
(ADS) to identify key data elements (e.g. naming, formatting, and levels of data fields) and tables/files
needed for data retrieval. These ADS provide a common framework for organizing data for external
audits. The AICPA audit data standards are available at www.aicpa.org/interestareas/frc/assuranceadvisoryservices/pages/auditdatastandardworkinggroup.aspx (accessed 30 November 2018).
References

Alawadhi, A. (2015), "The application of data visualization in auditing", PhD dissertation, Rutgers University.
Alles, M. and Gray, G. (2014), "Developing a framework for the role of big data in auditing: a
synthesis of the literature”, working paper, Rutgers Business School.
Amer, T. (1991), “An experimental investigation of multi-cue financial information display and decision
making", Journal of Information Systems, Vol. 5 No. 2, pp. 18-34.
American Institute of Certified Public Accountants (AICPA) (2015), Audit Analytics and Continuous Audit:
Looking toward the Future, AICPA, New York, NY, available at: www.aicpa.org/interestareas/frc/
assuranceadvisoryservices/downloadabledocuments/auditanalytics_lookingtowardfuture.pdf (accessed
28 August 2017).
Appelbaum, D., Kogan, A. and Vasarhelyi, M.A. (2017), “Big data and analytics in the modern audit
engagement: research needs”, Auditing: A Journal of Practice and Theory, Vol. 36 No. 4, pp. 1-27.
Aruna, D., Balakrishnan, S.R., Fussell, S.K. and Kittur, A. (2010), “Pitfalls of information access with
visualizations in remote collaborative analysis”, Proceedings of the 2010 ACM Conference on
Computer Supported Cooperative Work (CSCW'10), ACM, New York, NY, pp. 411-420, available
at: http://doi.acm.org/10.1145/1718918.1718988
Arunachalam, V., Pei, B.K.W. and Steinbart, P.J. (2002), “Impression management with graphs: effects
on choices”, Journal of Information System, Vol. 16 No. 2, pp. 183-202.
Asare, S.K. and Wright, A.M. (2003), “A note on the interdependence between hypothesis generation
and information search in conducting analytical procedures”, Contemporary Accounting
Research, Vol. 20 No. 2, pp. 235-251.
Ayers, S. and Kaplan, S.E. (1993), “An examination of the effect of hypothesis framing on auditors’
information choices in an analytical procedure task”, Abacus, Vol. 29 No. 2, pp. 113-130.
Beattie, V. and Jones, M.J. (1992), “The use and abuse of graphs in annual reports: theoretical
framework and empirical study”, Accounting and Business Research, Vol. 22 No. 88, pp. 291-303.
Benbasat, I. and Dexter, A.S. (1985), "An experimental evaluation of graphical and color-enhanced
information presentation”, Management Science, Vol. 31 No. 11, pp. 1348-1364.
Benbasat, I. and Dexter, A.S. (1986), “An investigation of the effectiveness of color and graphical
information presentation under varying time constraints”, MIS Quarterly, Vol. 10 No. 1, pp. 59-83.
Bendoly, E. (2016), “Fit, bias, and enacted sensemaking in data visualization: frameworks for
continuous development in operations and supply chain management analytics”, Journal of
Business Logistics, Vol. 37 No. 1, pp. 6-17.
Bettman, J.R. and Kakkar, P. (1977), “Effects of information presentation format on consumer
information acquisition strategies”, Journal of Consumer Research, Vol. 3 No. 4, pp. 233-240.
Biloslavo, R., Kregar, T.B. and Gorela, K. (2012), “Using visualization for strategic decision making: a
case of Slovenian entrepreneurs”, Proceedings of the 13th European Conference on Knowledge
Management, Vol. 1, pp. 83-92.
Blocher, E., Moffie, R.P. and Zmud, R.W. (1986), “Report format and task complexity: interaction in risk
judgments”, Accounting, Organizations and Society, Vol. 11 No. 6, pp. 457-470.
Bonner, S.E. and Walker, P.L. (1994), “The effects of instruction and experience on the acquisition of
auditing knowledge”, The Accounting Review, Vol. 69 No. 1, pp. 157-178.
Bowtell, J., Danson, F., Gonnella, N. and Steiger, M. (2014), "Data analytics and workforce
strategies: new insights for performance improvement and tax efficiency”, White Paper 12,
Deloitte.
Brown-Liburd, H., Issa, H. and Lombardi, D. (2015), “Behavioral implications of big data’s impact on
audit judgment and decision making and future research directions”, Accounting Horizons,
Vol. 29 No. 2, pp. 451-468.
Cao, M., Chychyla, R. and Stewart, T. (2015), "Big data analytics in financial statement audits", Accounting Horizons, Vol. 29 No. 2, pp. 423-429.
Center for Audit Quality (CAQ) (2014), “Professional judgment resource”, available at: www.thecaq.org/
docs/reports-and-publications/professional-judgment-resource.pdf?sfvrsn=4 (accessed 23 May
2018).
Chang, C.J., Yen, S.H. and Duh, R.-R. (2002), “An empirical examination of competing theories to explain
the framing effect in accounting-related decisions”, Behavioral Research in Accounting, Vol. 14
No. 1, pp. 35-64.
Committee of Sponsoring Organizations of the Treadway Commission (COSO) (2012), “Enhancing
board oversight: avoiding judgment traps and biases”, available at: www.coso.org/documents/
coso-enhancingboardoversight_r8_webready%20(2).pdf
Dilla, W., Janvrin, D.J. and Raschke, R. (2010), “Interactive data visualization: new directions for
accounting information systems research”, Journal of Information Systems, Vol. 24 No. 2,
pp. 1-37.
Eppler, M.J. and Aeschimann, M. (2009), “A systematic framework for risk visualization in risk
management and communication”, Risk Management, Vol. 11 No. 2, pp. 67-89.
Fagley, N.S. (1993), “A note concerning reflection effects versus framing effects”, Psychological Bulletin,
Vol. 113 No. 3, pp. 451-452.
Fay, R.G. and Montague, N.R. (2014), “Witnessing your own cognitive bias: a compendium of classroom
exercises”, Issues in Accounting Education, Vol. 30 No. 1, pp. 13-34.
Fisher, D. (2010), “Animation for visualization: opportunities and drawbacks”, in Steele, J. and Iliinsky,
N. (Eds), Beautiful Visualization: Looking at Data through the Eyes of Experts, O’Reilly,
Sebastopol, pp. 329-352.
Glazer, R., Steckel, J.H. and Winer, R.S. (1992), “Locally rational decision making: the distracting effect
of information on managerial performance”, Management Science, Vol. 38 No. 2, pp. 212-226.
Glover, S.M. and Prawitt, D.F. (2013), “Enhancing auditor professional skepticism”, available at: www.
researchgate.net/publication/258419768_Enhancing_Auditor_Professional_Skepticism (accessed
23 May 2018).
Green, M. (1998), “Toward a perceptual science of multidimensional data visualization: Bertin and beyond”,
available at: https://pdfs.semanticscholar.org/b5fd/6166cf2264100a403e1fda019d9e9c5c6303.pdf
(accessed 23 May 2018).
Hammersley, J.S. (2006), “Pattern identification and industry specialist auditors”, The Accounting
Review, Vol. 81 No. 2, pp. 309-336.
Healey, C.G., Booth, K.S. and Enns, J.T. (1995), “Visualizing real-time multivariate data using
preattentive processing”, ACM Transactions on Modeling and Computer Simulation, Vol. 5
No. 3, pp. 190-221.
Hirsch, B., Seubert, A. and Sohn, M. (2015), “Visualisation of data in management accounting reports:
How supplementary graphs improve every-day management judgments”, Journal of Applied
Accounting Research, Vol. 16 No. 2, pp. 221-239.
Hsee, C.K. (1996), “The evaluability hypothesis: an explanation for preference reversals between joint
and separate evaluations of alternatives”, Organizational Behavior and Human Decision
Processes, Vol. 67 No. 3, pp. 247-257.
Hullman, J. and Diakopoulos, N. (2011), “Visualization rhetoric: framing effects in narrative
visualization”, IEEE Transactions on Visualization and Computer Graphics, Vol. 17 No. 12,
pp. 2231-2240.
Jarvenpaa, S.L. (1989), “The effect of task demands and graphical format on information processing
strategies”, Management Science, Vol. 35 No. 3, pp. 285-303.
Jarvenpaa, S.L. (1990), “Graphic displays in decision making—the visual salience effect”, Journal of
Behavioral Decision Making, Vol. 3 No. 4, pp. 247-262.
Jarvenpaa, S.L. and Dickson, G.W. (1988), "Graphics and managerial decision making: research based guidelines", Communications of the ACM, Vol. 31 No. 6, pp. 764-774.
Jääskeläinen, A. and Roitto, J.M. (2016), "Visualization techniques supporting performance measurement system development", Measuring Business Excellence, Vol. 20 No. 2, pp. 13-25.
Johnson, J.R., Rice, R.R. and Roemmich, R.A. (1980), “Pictures that lie: the abuse of graphs in annual
reports”, Management Accounting, Vol. 62 No. 4, pp. 50-56.
Joyce, E.J. and Biddle, G.C. (1981), “Anchoring and adjustment in probabilistic inference in auditing”,
Journal of Accounting Research, Vol. 19 No. 1, pp. 120-145.
Julesz, B. (1981), “Textons, the elements of texture perception, and their interactions”, Nature, Vol. 290
No. 5802, pp. 91-97.
Kahneman, D. and Tversky, A. (1972), “Subjective probability: a judgment of representativeness”,
Cognitive Psychology, Vol. 3 No. 3, pp. 430-454.
Kennedy, J. and Peecher, M.E. (1997), “Judging auditors’ technical knowledge”, Journal of Accounting
Research, Vol. 35 No. 2, pp. 279-293.
Kinney, W.R. Jr and Uecker, W.C. (1982), “Mitigating the consequences of anchoring in auditor
judgments”, The Accounting Review, Vol. 57 No. 1, pp. 55-69.
Knapp, M.C. and Knapp, C.A. (2012), “Cognitive biases in audit engagements: errors in judgment and
strategies for prevention”, The CPA Journal, Vol. 82 No. 6, pp. 40-45.
Kobsa, A. (2001), “An empirical comparison of three commercial information visualization systems”,
IEEE Symposium on Information Visualization, 2001. INFOVIS 2001. IEEE, pp. 123-130.
Kogan, A., Alles, M.G., Vasarhelyi, M.A. and Wu, J. (2014), “Design and evaluation of a continuous data
level auditing system”, Auditing: A Journal of Practice and Theory, Vol. 33 No. 4, pp. 221-245.
KPMG (2011), “Elevating professional judgment in auditing and accounting: the KPMG
professional judgment framework”, available at: www.drlillie.com/a544/kpmg/jdgmt/
KPMG_ProfJudgment_Monograph.pdf (accessed 23 May 2018).
KPMG (2012), “Leveraging data analytics and continuous auditing processes for improved audit
planning, effectiveness, and efficiency”, available at: https://assets.kpmg.com/content/dam/
kpmg/pdf/2016/05/Leveraging-Data-Analytics.pdf (accessed 23 May 2018).
KPMG (2015), “Seeing beyond the numbers: improving revenue cycle results through data visualization”,
available at: www.kpmg-institutes.com/institutes/healthcare-lifesciences-institute/articles/2015/01/
visualizing-rev-cycle-issue-brief.html (accessed 23 May 2018).
Lawrence, M. and O’Connor, M. (1993), “Scale, variability, and the calibration of judgmental prediction
intervals”, Organizational Behavior and Human Decision Processes, Vol. 56 No. 3, pp. 441-458.
Libby, R. (1985), “Availability and the generation of hypotheses in analytical review”, Journal of
Accounting Research, Vol. 23 No. 2, pp. 648-667.
Libby, R., Bloomfield, R. and Nelson, M.W. (2002), “Experimental research in financial accounting”,
Accounting, Organizations and Society, Vol. 27 No. 8, pp. 775-810.
Lindgaard, G., Fernandes, G., Dudek, C. and Brown, J. (2006), “Attention web designers: you have 50
milliseconds to make a good first impression!”, Behaviour and Information Technology, Vol. 25
No. 2, pp. 115-126.
Lurie, N.H. and Mason, C.H. (2007), “Visual representation: implications for decision making”, Journal
of Marketing, Vol. 71 No. 1, pp. 160-177.
MacGregor, D. and Slovic, P. (1986), “Graphic representation of judgmental information”, Human-
Computer Interaction, Vol. 2 No. 3, pp. 179-200.
Mandel, N. and Johnson, E.J. (2002), “When web pages influence choice: effects of visual primes on
experts and novices”, Journal of Consumer Research, Vol. 29 No. 2, pp. 235-245.
Murphy, M.L. and Tysiac, K. (2015), “Data analytics helps auditors gain deep insight”, Journal of
Accountancy, Vol. 219 No. 4, pp. 54-58.
Nisbett, R.E. and Ross, L. (1980), Human Inference: Strategies and Shortcomings of Social Judgment, Prentice Hall, Englewood Cliffs, NJ.
Northcraft, G.B. and Neale, M.A. (1987), “Experts, amateurs, and real estate: an anchoring-and
adjustment perspective on property pricing decisions”, Organizational Behavior and Human
Decision Processes, Vol. 39 No. 1, pp. 84-97.
O’Clock, P. and Devine, K. (1995), “An investigation of framing and firm size on the auditor's going
concern decision”, Accounting and Business Research, Vol. 25 No. 99, pp. 197-207.
Paddrick, M.E., Haynes, R., Todd, A.E., Scherer, W.T. and Beling, P.A. (2016), “Visual analysis to
support regulators in electronic order book market”, Environment Systems and Decisions, Vol. 36
No. 1, pp. 167-182.
Phillips, B., Prybutok, V.R. and Peak, D.A. (2014), “Decision confidence, information usefulness, and
information seeking intention in the presence of disconfirming information”, Informing Science,
Vol. 17 No. 1, pp. 1-24.
Rose, A.M., Rose, J.M., Sanderson, K. and Thibodeau, J.C. (2017), “When should audit firms introduce
analysis of big data into the audit process?”, Journal of Information System, Vol. 31 No. 3,
pp. 81-99.
Shanteau, J. (1989), “Cognitive heuristics and biases in behavioral auditing: review, comments and
observations”, Accounting, Organizations and Society, Vol. 14 Nos 1/2, pp. 165-177.
Simkin, D. and Hastie, R. (1987), “An information processing analysis of graph perception”, Journal of
the American Statistical Association, Vol. 82 No. 398, pp. 454-465.
Singh, K. and Best, P. (2016), “Interactive visual analysis of anomalous accounts payable transactions
in SAP enterprise systems”, Managerial Auditing Journal, Vol. 31 No. 1, pp. 35-63.
Singh, K., Best, P. and Mula, J. (2013), “Automating vendor fraud detection in enterprise systems”, The
Journal of Digital Forensics, Security and Law, Vol. 8 No. 2, pp. 7-42.
Sloman, S.A. (1996), “The empirical case for two systems of reasoning”, Psychological Bulletin, Vol. 119
No. 1, pp. 3-22.
Stone, E.R., Yates, J.F. and Parker, A.M. (1997), “Effects of numerical and graphical displays on
professed risk-taking behavior”, Journal of Experimental Psychology: Applied, Vol. 3 No. 4,
pp. 243-256.
Tang, F., Hess, T.J., Valacich, J.S. and Sweeney, J.T. (2013), “The effects of visualization and
interactivity on calibration in financial decision-making”, Behavioral Research in Accounting,
Vol. 26 No. 1, pp. 25-58.
Treisman, A. (1985), “Preattentive processing in vision”, Computer Vision, Graphics, and Image
Processing, Vol. 31 No. 2, pp. 156-177.
Trotman, K.T., Tan, H.C. and Ang, N. (2011), “Fifty-year overview of judgment and decision making
research in accounting”, Accounting and Finance, Vol. 51 No. 1, pp. 278-360.
Tversky, A. and Kahneman, D. (1973), “Availability: a heuristic for judging frequency and probability”,
Cognitive Psychology, Vol. 5 No. 2, pp. 207-232.
Tversky, A. and Kahneman, D. (1974), “Judgment under uncertainty: heuristics and biases”, Science,
Vol. 185 No. 4157, pp. 1124-1131.
Valdez, A.C., Ziefle, M. and Sedlmair, M. (2018), “Priming and anchoring effects in visualization”, IEEE
Transactions on Visualization and Computer Graphics, Vol. 24 No. 1, pp. 584-594.
Vessey, I. and Galletta, D. (1991), “Cognitive fit: an empirical study of information acquisition”,
Information Systems Research, Vol. 2 No. 1, pp. 63-84.

Corresponding author
Chengyee Janie Chang can be contacted at: jchang@mail.sdsu.edu
Appendix

Figure A1. Illustration of a dashboard
