https://www.emerald.com/insight/0268-6902.htm

Data visualization and cognitive biases in audits
Chengyee Janie Chang and Yan Luo
Charles W. Lamden School of Accountancy, San Diego State University, San Diego, California, USA

Received 31 August 2017
Revised 24 May 2018, 5 October 2018, 1 December 2018
Accepted 31 December 2018

Abstract
Purpose – This paper aims to examine major cognitive biases in auditors’ analyses involving visualization and proposes practical approaches to address such biases in data visualization.
Design/methodology/approach – Using the professional judgment framework of KPMG (2011), this
study performs an analysis of whether and how five major types of cognitive biases (framing, availability,
overconfidence, anchoring and confirmation) may occur in an auditor’s data visualization and how such
biases potentially compromise audit quality.
Findings – The analysis suggests that data visualization can trigger and/or aggravate the common
cognitive biases in audit. If not properly addressed, such biases may adversely affect auditors' judgment and
decision-making.
Practical implications – To ensure that data visualization improves audit efficiency and effectiveness, it
is essential that auditors are aware of and successfully address cognitive biases in data visualization. Six
practical approaches to debias cognitive biases in auditors’ visualization are proposed: using data
visualization to complement rather than supplement traditional audit evidence; positioning data visualization
to support rather than replace sophisticated analytics tools; using a dashboard with multiple dimensions;
using both visualized and tabular data in analyses; assigning experienced audit staff; and providing pre-audit
tutorials on cognitive bias and visualization.
Originality/value – The study raises awareness of psychological issues in an audit setting.
Keywords Cognitive bias, Audit
Paper type Research paper
1. Introduction
In the age of Big Data, stakeholders in capital markets (e.g. managers, investors, auditors,
government agencies, etc.) are increasingly overloaded with electronic information.
Auditing standards require auditors to obtain an understanding of an audit entity and its
environment to assess the risk of material misstatements in an audit engagement (AU
Section 314). The sheer amount of information generated by audited entities’ ever-increasing
and diverse use of information technologies challenges auditors’ ability to make effective
and efficient audit decisions. Technology-enabled data visualization has great potential to
improve audit efficiency and effectiveness by transforming large complex datasets into
high-level compact graphic representations of the data that can facilitate both the discovery
and communication of valuable and latent patterns. However, these benefits can only be
achieved if auditors are capable of mitigating the potential cognitive biases in data
visualization and are willing to subject the insights gained from visualization to more
sophisticated statistical analysis (Sloman, 1996; Lurie and Mason, 2007; Cao et al., 2015;
Hirsch et al., 2015; Paddrick et al., 2016).

[Managerial Auditing Journal, Vol. 36 No. 1, 2021, pp. 1-16. © Emerald Publishing Limited, 0268-6902. DOI 10.1108/MAJ-08-2017-1637]

It has been demonstrated that the human brain can process more information at a faster
rate when the information is presented graphically than when it is presented in text or
tabular formats (Lurie and Mason, 2007). Graphic displays allow auditors, especially novice
auditors, to quickly comprehend the data, detect certain relevant patterns and identify
trends and relationships that would take much more effort to identify if presented in tables
(Vessey and Galletta, 1991; Singh and Best, 2016). Thus, visualization can be used in every
phase of an audit, including decisions to accept or continue an audit engagement, audit
planning, risk assessment, responding to risk assessments and tests of controls or
substantive testing (AICPA, 2015).
Currently, visualization is being used by some auditors to generate insights, increase the
accuracy of conclusions and improve the efficiency and effectiveness of the auditing process
(Singh et al., 2013; AICPA, 2015; Brown-Liburd et al., 2015; Cao et al., 2015; Singh and Best,
2016)[1]. Data visualization can be used to both explore and explain data (Alawadhi, 2015).
In exploratory data visualization, visualization tools (e.g. scatterplots, trend lines, bubble
charts, tables, etc.) are used before and during the process of gathering and evaluating audit
evidence to explore data relationships from various perspectives, discover new and
meaningful patterns and detect discontinuities, exceptions and outliers that might be
concealed in the operational and financial data of an audit client (Lurie and Mason, 2007;
Brown-Liburd et al., 2015; Paddrick et al., 2016). Exploratory data visualization can also be
used to analyze a whole population of transactions rather than a sample, which may make
some audit procedures more effective, as patterns, trends or anomalies in a population are
not always detectable when the procedures are applied to samples (Alles and Gray, 2014;
Brown-Liburd et al., 2015). Although more sophisticated statistical tools should be applied to
the data before any conclusions are drawn, preliminary exploratory visualization is useful
when the datasets are very large or if the specific nature and relationships of the data are not
clearly predefined. In contrast, explanatory data visualization usually occurs after auditors
have analyzed the data using more established, sophisticated tools (e.g. regression analysis).
It is commonly used to synthesize and communicate auditors’ main findings to convince the
viewers of the auditors’ conclusions and facilitate decision-making (Arunachalam et al.,
2002; Eppler and Aeschimann, 2009; Fisher, 2010; Alawadhi, 2015; Appelbaum et al., 2017).
In conclusion, exploratory data visualization techniques are used to analyze the data,
whereas explanatory data visualization techniques are used to communicate the results of
the analyses.
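The exploratory workflow described above — scanning a whole population of transactions for anomalies before applying more sophisticated statistical tools — can be sketched in a few lines. The library choice, column names and threshold below are illustrative assumptions, not part of this paper:

```python
# Exploratory sketch: flag candidate outliers in a full transaction
# population before any formal statistical testing.
# All names (columns, threshold) are hypothetical.
import pandas as pd

def flag_outliers(df: pd.DataFrame, column: str,
                  z_threshold: float = 3.0) -> pd.DataFrame:
    """Return rows whose value lies more than z_threshold standard
    deviations from the mean -- candidates for follow-up audit work."""
    z = (df[column] - df[column].mean()) / df[column].std()
    return df[z.abs() > z_threshold]

# Whole-population scan (not a sample), as discussed above.
txns = pd.DataFrame({
    "invoice_id": range(1, 1001),
    "amount": [100.0] * 999 + [50_000.0],  # one anomalous posting
})
suspects = flag_outliers(txns, "amount")
# The flagged rows would then be examined in a scatterplot or trend
# line and, before any conclusion is drawn, re-tested statistically.
```

Consistent with the paper's caveat, such a scan only surfaces candidates for inspection; it does not replace the more established analytical tools applied afterwards.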
Prior research investigating the effects of visualization on decision-making in the context
of auditing suggests that data visualization enhances auditor judgment and audit quality
and can benefit audit procedures, such as accepting or continuing an audit engagement, risk
assessment and analytical procedures (Blocher et al., 1986; Kogan et al., 2014; Hirsch et al.,
2015; Cao et al., 2015). However, to the best of our knowledge, prior studies have not
examined the potential negative effects of visualization on decision-making in auditing. In
particular, it is unclear whether visualization’s effects on cognitive biases impact the
efficiency and effectiveness of audit procedures. This study explores whether and how
cognitive bias is triggered or aggravated by data visualization and how such biases
potentially compromise audit quality.
Cognitive bias refers to the tendency of individuals to make systematic mistakes in
judgment when making decisions (Kahneman and Tversky, 1972), and it has been identified
and studied in behavioral audit research (Trotman et al., 2011). To ensure that audit
decisions are based on relevant and trustworthy evidence, it is critical to understand how
any cognitive biases triggered or aggravated by explanatory and exploratory data
visualization might affect the process of gathering and evaluating audit evidence and thus
potentially compromise audit quality.
This study examines whether and how data visualization can trigger and/or aggravate
the five types of cognitive bias that are most relevant in auditing contexts: framing,
availability, overconfidence, anchoring and confirmation biases, specifically identified in the
KPMG (2011) professional judgment framework[2]. Prior research (Joyce and Biddle, 1981;
Lurie and Mason, 2007) shows that these biases can be operationalized through the
manipulation of reference points, the format of the graphic presentation, or the vividness
and evaluability of the data. Based on the insights from audit practitioners and regulators as
well as from extant psychology and behavioral research (Benbasat and Dexter, 1986;
Hammersley, 2006; Lurie and Mason, 2007; KPMG, 2011; Singh and Best, 2016; Rose et al.,
2017), six approaches to debiasing auditors’ visualization analyses are identified: using data
visualization to complement rather than supplement traditional audit evidence; positioning
data visualization to support rather than replace sophisticated analytics tools; using a
dashboard with multiple dimensions; using both visualized and tabular data in analyses;
assigning experienced audit staff; and providing pre-audit tutorials on cognitive bias and
visualization.
This study has important implications for audit practitioners, standard setters, educators
and researchers. First, it identifies key cognitive biases that potentially constrain the
effectiveness of auditors’ use of data visualization in the audit process and identifies six
approaches that may mitigate these biases. Second, it suggests that audit standard setters
should consider adjusting auditing standards to clarify the role of data visualization and
should provide specific guidelines on how to use data visualization tools to perform audits
and on how to interpret and document audit evidence collected from visualization. Third, it
highlights the need to include visualization and potential cognitive biases in the auditing
curriculum. Last but not least, this study is a response to the call for research on both the
positive and negative effects of Big Data analytics (including visualization) on auditor
judgment and to investigate solutions/approaches that might mitigate any negative impact
on auditor judgment (Brown-Liburd et al., 2015).
The remainder of this paper is organized as follows. Section 2 defines the five common
cognitive biases in an audit setting, discusses how visualization may trigger or aggravate
specific biases and demonstrates how such biases impede the effectiveness and efficiency of
audit procedures, decision-making and judgment. Section 3 proposes techniques for
mitigating the cognitive biases that may occur when auditors use visualization methods.
The final section considers this study’s academic, regulatory and practical implications and
the challenges in applying visualization methods to audits.
Hullman and Diakopoulos (2011) point out that data visualization can anchor the users’
interpretations, which are most likely to be formed according to the dimensions of the data
used in the visualization analysis (e.g. the default view in a data visualization tool)[3]. This
reduces decision makers’ (e.g. auditors’) attention to other aspects of the data and the
likelihood that auditors will challenge the reasonableness of management’s estimations or
interpretations, especially when they are re-confirmed by other charts that appear when
users simply click on “view more charts” in the visualization tools (e.g. Tableau or Power BI)
without changing the dimensions being analyzed. Thus, visualizations can lead auditors to
overly rely on preliminary analyses, resulting in suboptimal audit procedures and final
judgments that are close to the client’s preferred initial direction/trend (KPMG, 2011; CAQ,
2014).
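The debiasing step implied above — examining the same business measurement across several dimensions rather than accepting the tool's default view — can be illustrated with a small sketch. The data and column names below are hypothetical:

```python
# Viewing one measurement (revenue) through several dimensions, so
# that no single default chart anchors the analysis.
# Data and column names are hypothetical.
import pandas as pd

sales = pd.DataFrame({
    "customer": ["A", "A", "B", "B"],
    "year":     [2020, 2021, 2020, 2021],
    "revenue":  [120.0, 80.0, 60.0, 140.0],
})

# The same key measurement sliced three different ways:
by_customer = sales.groupby("customer")["revenue"].sum()
by_year     = sales.groupby("year")["revenue"].sum()
by_both     = sales.pivot_table(index="customer", columns="year",
                                values="revenue", aggfunc="sum")
# A dashboard built from all three views makes it harder to anchor
# on whichever chart the visualization tool renders first.
```

This mirrors the multi-dimensional dashboard approach proposed in Section 3: each additional dimensional view gives the auditor a fresh reference point against which to challenge a preliminary interpretation.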
Notes
1. Visualization is widely used in other areas of the business community including accounting
(Dilla et al., 2010; Hirsch et al., 2015), supply chain management (Bendoly, 2016), performance
measurement systems (Jääskeläinen and Roitto, 2016), strategic decision-making (Biloslavo et al.,
2012), financial investment (Tang et al., 2013) and marketing (Lurie and Mason, 2007).
2. Specifically, KPMG (2011, p. 23) suggest that their professional judgment framework covers five
common “tendencies that are most applicable and important for audit professionals [. . .] purpose
is to illustrate that the tendencies are common and that the related biases affect all of us.”
3. A dimension provides context/reference information about a business measurement, such as
revenue or cost of goods sold. Revenue by customer, revenue by division, or revenue by year
are examples of viewing the same key business measurement using different dimensions.
4. For example, Murphy and Tysiac (2015) mention that AICPA has established audit data standards
(ADS) to identify key data elements (e.g. naming, formatting, and levels of data fields) and tables/files
needed for data retrieval. These ADS provide a common framework for organizing data for external
audits. The AICPA audit data standards are available at www.aicpa.org/interestareas/frc/
assuranceadvisoryservices/pages/auditdatastandardworkinggroup.aspx (accessed 30 November
2018).
References
Alawadhi, A. (2015), “The application of data visualization in auditing”, PhD dissertation, Rutgers
University.
Alles, M. and Gray, G. (2014), “Developing a framework for the role of big data in auditing: a
synthesis of the literature”, working paper, Rutgers Business School.
Amer, T. (1991), “An experimental investigation of multi-cue financial information display and decision
making”, Journal of Information Systems, Vol. 5 No. 2, pp. 18-34.
American Institute of Certified Public Accountants (AICPA) (2015), Audit Analytics and Continuous Audit:
Looking toward the Future, AICPA, New York, NY, available at: www.aicpa.org/interestareas/frc/
assuranceadvisoryservices/downloadabledocuments/auditanalytics_lookingtowardfuture.pdf (accessed
28 August 2017).
Appelbaum, D., Kogan, A. and Vasarhelyi, M.A. (2017), “Big data and analytics in the modern audit
engagement: research needs”, Auditing: A Journal of Practice and Theory, Vol. 36 No. 4, pp. 1-27.
Balakrishnan, A.D., Fussell, S.R., Kiesler, S. and Kittur, A. (2010), “Pitfalls of information access with
visualizations in remote collaborative analysis”, Proceedings of the 2010 ACM Conference on
Computer Supported Cooperative Work (CSCW'10), ACM, New York, NY, pp. 411-420, available
at: http://doi.acm.org/10.1145/1718918.1718988
Arunachalam, V., Pei, B.K.W. and Steinbart, P.J. (2002), “Impression management with graphs: effects
on choices”, Journal of Information System, Vol. 16 No. 2, pp. 183-202.
Asare, S.K. and Wright, A.M. (2003), “A note on the interdependence between hypothesis generation
and information search in conducting analytical procedures”, Contemporary Accounting
Research, Vol. 20 No. 2, pp. 235-251.
Ayers, S. and Kaplan, S.E. (1993), “An examination of the effect of hypothesis framing on auditors’
information choices in an analytical procedure task”, Abacus, Vol. 29 No. 2, pp. 113-130.
Beattie, V. and Jones, M.J. (1992), “The use and abuse of graphs in annual reports: theoretical
framework and empirical study”, Accounting and Business Research, Vol. 22 No. 88, pp. 291-303.
Benbasat, I. and Dexter, A.S. (1985), “An experimental evaluation of graphical and color-enhanced
information presentation”, Management Science, Vol. 31 No. 11, pp. 1348-1364.
Benbasat, I. and Dexter, A.S. (1986), “An investigation of the effectiveness of color and graphical
information presentation under varying time constraints”, MIS Quarterly, Vol. 10 No. 1, pp. 59-83.
Bendoly, E. (2016), “Fit, bias, and enacted sensemaking in data visualization: frameworks for
continuous development in operations and supply chain management analytics”, Journal of
Business Logistics, Vol. 37 No. 1, pp. 6-17.
Bettman, J.R. and Kakkar, P. (1977), “Effects of information presentation format on consumer
information acquisition strategies”, Journal of Consumer Research, Vol. 3 No. 4, pp. 233-240.
Biloslavo, R., Kregar, T.B. and Gorela, K. (2012), “Using visualization for strategic decision making: a
case of Slovenian entrepreneurs”, Proceedings of the 13th European Conference on Knowledge
Management, Vol. 1, pp. 83-92.
Blocher, E., Moffie, R.P. and Zmud, R.W. (1986), “Report format and task complexity: interaction in risk
judgments”, Accounting, Organizations and Society, Vol. 11 No. 6, pp. 457-470.
Bonner, S.E. and Walker, P.L. (1994), “The effects of instruction and experience on the acquisition of
auditing knowledge”, The Accounting Review, Vol. 69 No. 1, pp. 157-178.
Bowtell, J., Danson, F., Gonnella, N. and Steiger, M. (2014), “Data analytics and workforce
strategies: new insights for performance improvement and tax efficiency”, White Paper 12,
Deloitte.
Brown-Liburd, H., Issa, H. and Lombardi, D. (2015), “Behavioral implications of big data’s impact on
audit judgment and decision making and future research directions”, Accounting Horizons,
Vol. 29 No. 2, pp. 451-468.
Cao, M., Chychyla, R. and Stewart, T. (2015), “Big data analytics in financial statement audits”,
Accounting Horizons, Vol. 29 No. 2, pp. 423-429.
Center for Audit Quality (CAQ) (2014), “Professional judgment resource”, available at: www.thecaq.org/
docs/reports-and-publications/professional-judgment-resource.pdf?sfvrsn=4 (accessed 23 May
2018).
Chang, C.J., Yen, S.H. and Duh, R.-R. (2002), “An empirical examination of competing theories to explain
the framing effect in accounting-related decisions”, Behavioral Research in Accounting, Vol. 14
No. 1, pp. 35-64.
Committee of Sponsoring Organizations of the Treadway Commission (COSO) (2012), “Enhancing
board oversight: avoiding judgment traps and biases”, available at: www.coso.org/documents/
coso-enhancingboardoversight_r8_webready%20(2).pdf
Dilla, W., Janvrin, D.J. and Raschke, R. (2010), “Interactive data visualization: new directions for
accounting information systems research”, Journal of Information Systems, Vol. 24 No. 2,
pp. 1-37.
Eppler, M.J. and Aeschimann, M. (2009), “A systematic framework for risk visualization in risk
management and communication”, Risk Management, Vol. 11 No. 2, pp. 67-89.
Fagley, N.S. (1993), “A note concerning reflection effects versus framing effects”, Psychological Bulletin,
Vol. 113 No. 3, pp. 451-452.
Fay, R.G. and Montague, N.R. (2014), “Witnessing your own cognitive bias: a compendium of classroom
exercises”, Issues in Accounting Education, Vol. 30 No. 1, pp. 13-34.
Fisher, D. (2010), “Animation for visualization: opportunities and drawbacks”, in Steele, J. and Iliinsky,
N. (Eds), Beautiful Visualization: Looking at Data through the Eyes of Experts, O’Reilly,
Sebastopol, pp. 329-352.
Glazer, R., Steckel, J.H. and Winer, R.S. (1992), “Locally rational decision making: the distracting effect
of information on managerial performance”, Management Science, Vol. 38 No. 2, pp. 212-226.
Glover, S.M. and Prawitt, D.F. (2013), “Enhancing auditor professional skepticism”, available at: www.
researchgate.net/publication/258419768_Enhancing_Auditor_Professional_Skepticism (accessed
23 May 2018).
Green, M. (1998), “Toward a perceptual science of multidimensional data visualization: Bertin and beyond”,
available at: https://pdfs.semanticscholar.org/b5fd/6166cf2264100a403e1fda019d9e9c5c6303.pdf
(accessed 23 May 2018).
Hammersley, J.S. (2006), “Pattern identification and industry specialist auditors”, The Accounting
Review, Vol. 81 No. 2, pp. 309-336.
Healey, C.G., Booth, K.S. and Enns, J.T. (1995), “Visualizing real-time multivariate data using
preattentive processing”, ACM Transactions on Modeling and Computer Simulation, Vol. 5
No. 3, pp. 190-221.
Hirsch, B., Seubert, A. and Sohn, M. (2015), “Visualisation of data in management accounting reports:
How supplementary graphs improve every-day management judgments”, Journal of Applied
Accounting Research, Vol. 16 No. 2, pp. 221-239.
Hsee, C.K. (1996), “The evaluability hypothesis: an explanation for preference reversals between joint
and separate evaluations of alternatives”, Organizational Behavior and Human Decision
Processes, Vol. 67 No. 3, pp. 247-257.
Hullman, J. and Diakopoulos, N. (2011), “Visualization rhetoric: framing effects in narrative
visualization”, IEEE Transactions on Visualization and Computer Graphics, Vol. 17 No. 12,
pp. 2231-2240.
Jarvenpaa, S.L. (1989), “The effect of task demands and graphical format on information processing
strategies”, Management Science, Vol. 35 No. 3, pp. 285-303.
Jarvenpaa, S.L. (1990), “Graphic displays in decision making—the visual salience effect”, Journal of
Behavioral Decision Making, Vol. 3 No. 4, pp. 247-262.
Jarvenpaa, S.L. and Dickson, G.W. (1988), “Graphics and managerial decision making: research based
guidelines”, Communications of the ACM, Vol. 31 No. 6, pp. 764-774.
Jääskeläinen, A. and Roitto, J.M. (2016), “Visualization techniques supporting performance
measurement system development”, Measuring Business Excellence, Vol. 20 No. 2, pp. 13-25.
Johnson, J.R., Rice, R.R. and Roemmich, R.A. (1980), “Pictures that lie: the abuse of graphs in annual
reports”, Management Accounting, Vol. 62 No. 4, pp. 50-56.
Joyce, E.J. and Biddle, G.C. (1981), “Anchoring and adjustment in probabilistic inference in auditing”,
Journal of Accounting Research, Vol. 19 No. 1, pp. 120-145.
Julesz, B. (1981), “Textons, the elements of texture perception, and their interactions”, Nature, Vol. 290
No. 5802, pp. 91-97.
Kahneman, D. and Tversky, A. (1972), “Subjective probability: a judgment of representativeness”,
Cognitive Psychology, Vol. 3 No. 3, pp. 430-454.
Kennedy, J. and Peecher, M.E. (1997), “Judging auditors’ technical knowledge”, Journal of Accounting
Research, Vol. 35 No. 2, pp. 279-293.
Kinney, W.R. Jr and Uecker, W.C. (1982), “Mitigating the consequences of anchoring in auditor
judgments”, The Accounting Review, Vol. 57 No. 1, pp. 55-69.
Knapp, M.C. and Knapp, C.A. (2012), “Cognitive biases in audit engagements: errors in judgment and
strategies for prevention”, The CPA Journal, Vol. 82 No. 6, pp. 40-45.
Kobsa, A. (2001), “An empirical comparison of three commercial information visualization systems”,
IEEE Symposium on Information Visualization, 2001. INFOVIS 2001. IEEE, pp. 123-130.
Kogan, A., Alles, M.G., Vasarhelyi, M.A. and Wu, J. (2014), “Design and evaluation of a continuous data
level auditing system”, Auditing: A Journal of Practice and Theory, Vol. 33 No. 4, pp. 221-245.
KPMG (2011), “Elevating professional judgment in auditing and accounting: the KPMG
professional judgment framework”, available at: www.drlillie.com/a544/kpmg/jdgmt/
KPMG_ProfJudgment_Monograph.pdf (accessed 23 May 2018).
KPMG (2012), “Leveraging data analytics and continuous auditing processes for improved audit
planning, effectiveness, and efficiency”, available at: https://assets.kpmg.com/content/dam/
kpmg/pdf/2016/05/Leveraging-Data-Analytics.pdf (accessed 23 May 2018).
KPMG (2015), “Seeing beyond the numbers: improving revenue cycle results through data visualization”,
available at: www.kpmg-institutes.com/institutes/healthcare-lifesciences-institute/articles/2015/01/
visualizing-rev-cycle-issue-brief.html (accessed 23 May 2018).
Lawrence, M. and O’Connor, M. (1993), “Scale, variability, and the calibration of judgmental prediction
intervals”, Organizational Behavior and Human Decision Processes, Vol. 56 No. 3, pp. 441-458.
Libby, R. (1985), “Availability and the generation of hypotheses in analytical review”, Journal of
Accounting Research, Vol. 23 No. 2, pp. 648-667.
Libby, R., Bloomfield, R. and Nelson, M.W. (2002), “Experimental research in financial accounting”,
Accounting, Organizations and Society, Vol. 27 No. 8, pp. 775-810.
Lindgaard, G., Fernandes, G., Dudek, C. and Brown, J. (2006), “Attention web designers: you have 50
milliseconds to make a good first impression!”, Behaviour and Information Technology, Vol. 25
No. 2, pp. 115-126.
Lurie, N.H. and Mason, C.H. (2007), “Visual representation: implications for decision making”, Journal
of Marketing, Vol. 71 No. 1, pp. 160-177.
MacGregor, D. and Slovic, P. (1986), “Graphic representation of judgmental information”, Human-
Computer Interaction, Vol. 2 No. 3, pp. 179-200.
Mandel, N. and Johnson, E.J. (2002), “When web pages influence choice: effects of visual primes on
experts and novices”, Journal of Consumer Research, Vol. 29 No. 2, pp. 235-245.
Murphy, M.L. and Tysiac, K. (2015), “Data analytics helps auditors gain deep insight”, Journal of
Accountancy, Vol. 219 No. 4, pp. 54-58.
Nisbett, R.E. and Ross, L. (1980), Human Inference: Strategies and Shortcomings of Social Judgment,
Prentice Hall, Englewood Cliffs, NJ.
Northcraft, G.B. and Neale, M.A. (1987), “Experts, amateurs, and real estate: an anchoring-and-adjustment
perspective on property pricing decisions”, Organizational Behavior and Human
Decision Processes, Vol. 39 No. 1, pp. 84-97.
O’Clock, P. and Devine, K. (1995), “An investigation of framing and firm size on the auditor's going
concern decision”, Accounting and Business Research, Vol. 25 No. 99, pp. 197-207.
Paddrick, M.E., Haynes, R., Todd, A.E., Scherer, W.T. and Beling, P.A. (2016), “Visual analysis to
support regulators in electronic order book market”, Environment Systems and Decisions, Vol. 36
No. 1, pp. 167-182.
Phillips, B., Prybutok, V.R. and Peak, D.A. (2014), “Decision confidence, information usefulness, and
information seeking intention in the presence of disconfirming information”, Informing Science,
Vol. 17 No. 1, pp. 1-24.
Rose, A.M., Rose, J.M., Sanderson, K. and Thibodeau, J.C. (2017), “When should audit firms introduce
analysis of big data into the audit process?”, Journal of Information System, Vol. 31 No. 3,
pp. 81-99.
Shanteau, J. (1989), “Cognitive heuristics and biases in behavioral auditing: review, comments and
observations”, Accounting, Organizations and Society, Vol. 14 Nos 1/2, pp. 165-177.
Simkin, D. and Hastie, R. (1987), “An information processing analysis of graph perception”, Journal of
the American Statistical Association, Vol. 82 No. 398, pp. 454-465.
Singh, K. and Best, P. (2016), “Interactive visual analysis of anomalous accounts payable transactions
in SAP enterprise systems”, Managerial Auditing Journal, Vol. 31 No. 1, pp. 35-63.
Singh, K., Best, P. and Mula, J. (2013), “Automating vendor fraud detection in enterprise systems”, The
Journal of Digital Forensics, Security and Law, Vol. 8 No. 2, pp. 7-42.
Sloman, S.A. (1996), “The empirical case for two systems of reasoning”, Psychological Bulletin, Vol. 119
No. 1, pp. 3-22.
Stone, E.R., Yates, J.F. and Parker, A.M. (1997), “Effects of numerical and graphical displays on
professed risk-taking behavior”, Journal of Experimental Psychology: Applied, Vol. 3 No. 4,
pp. 243-256.
Tang, F., Hess, T.J., Valacich, J.S. and Sweeney, J.T. (2013), “The effects of visualization and
interactivity on calibration in financial decision-making”, Behavioral Research in Accounting,
Vol. 26 No. 1, pp. 25-58.
Treisman, A. (1985), “Preattentive processing in vision”, Computer Vision, Graphics, and Image
Processing, Vol. 31 No. 2, pp. 156-177.
Trotman, K.T., Tan, H.C. and Ang, N. (2011), “Fifty-year overview of judgment and decision making
research in accounting”, Accounting and Finance, Vol. 51 No. 1, pp. 278-360.
Tversky, A. and Kahneman, D. (1973), “Availability: a heuristic for judging frequency and probability”,
Cognitive Psychology, Vol. 5 No. 2, pp. 207-232.
Tversky, A. and Kahneman, D. (1974), “Judgment under uncertainty: heuristics and biases”, Science,
Vol. 185 No. 4157, pp. 1124-1131.
Valdez, A.C., Ziefle, M. and Sedlmair, M. (2018), “Priming and anchoring effects in visualization”, IEEE
Transactions on Visualization and Computer Graphics, Vol. 24 No. 1, pp. 584-594.
Vessey, I. and Galletta, D. (1991), “Cognitive fit: an empirical study of information acquisition”,
Information Systems Research, Vol. 2 No. 1, pp. 63-84.
Corresponding author
Chengyee Janie Chang can be contacted at: jchang@mail.sdsu.edu
Appendix

Figure A1. Illustration of a dashboard