
Biosensors and Bioelectronics: X 14 (2023) 100361


Trigit: A free web application for rapid colorimetric analysis of images


Angie Davina Tjandra, Tristan Heywood, Rona Chandrawati *
School of Chemical Engineering and Australian Centre for Nanomedicine (ACN), The University of New South Wales (UNSW Sydney), Sydney, NSW, 2052, Australia

Keywords: Colorimetric analysis; Image processing; Color extraction tool; RGB extraction; CMYK extraction; HSV extraction; CIELAB extraction

Abstract

Color visualization is one of the fundamental aspects in advancing many scientific fields. For example, monitoring color change has been used to identify abnormalities in health, environmental, water, agricultural, and food quality. Human color vision is subjective, and objective colorimetric analysis is essential to prevent incorrect deductions. The ImageJ program is commonly used to process color images. Although versatile, ImageJ is neither designed nor optimized for color extraction and analysis. As such, the color extraction process is slow and the user interface is not intuitive. To streamline this process, we designed Trigit (http://trigit.com.au) as a free, rapid, and user-friendly web app to quantify color signals from images. The estimated analysis time is ~1.5 min for new users and <20 s for experienced users, independent of the color space (CIELAB, HSV, CMYK, RGB, HEX). Compared to ImageJ, Trigit is 4–9 times and 10–15 times faster in extracting RGB and non-RGB color values, respectively, in a format that can be easily exported to a Microsoft Excel program. Trigit can be pre-downloaded into portable drives to enable use without internet access. Trigit can also be accessed from smartphones for on-site color extraction. We believe that Trigit's intuitive and rapid color extraction capability could accelerate the advancement of colorimetric technologies by alleviating the unnecessary time spent during color extraction.

Abbreviations: RGB, Red Green Blue; CMYK, Cyan Magenta Yellow Key; HSV, Hue Saturation Value; ROI, Region of Interest; UI, User Interface; OS, Operating System.
* Corresponding author. E-mail address: rona.chandrawati@unsw.edu.au (R. Chandrawati).
https://doi.org/10.1016/j.biosx.2023.100361
Received 6 March 2023; Accepted 20 May 2023; Available online 29 May 2023
2590-1370/© 2023 The Author(s). Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

1. Introduction

Color is a fundamental aspect of human's visual experience. Human color vision can be explained by integrating physics and biology. According to Newton's Opticks study in physics, the color we perceive is the reflected color (light wavelength) from the surface of an object and the remaining wavelengths are absorbed by the object (Newton et al., 1721). In biology, human color vision is governed by two types of photoreceptor cells in the retina, which are the cone and rod cells (Fain and Sampath, 2018). The cone cells are dominant under relatively bright conditions and respond uniquely to the wavelengths in the visible light spectrum corresponding to the 3 primary colors including red, green, and blue (RGB). In contrast, rod cells are active under dim conditions, switching our vision from chromatic to an almost or fully achromatic view. Based on the types and number of stimulated cells, the human eye and brain then combine this information to produce a unique signal corresponding to a color. The number of distinguishable shades varies in each person and this value can range from 1 to 100 million colors (Jordan et al., 2010).

Identifying color changes via the naked eye has been a foundational aspect in many fields, particularly in scientific applications. For example, relying on human vision to detect color changes in optical colorimetric sensors is beneficial as it unlocks the feasibilities of low-cost, portable and on-site testing, which bypasses the need for costly specialized laboratory instruments (Nguyen et al., 2019, 2020; Mazur et al., 2020). Visualizing the change in skin color could predict one's health and disease progression (Ly et al., 2020). Similarly, plant diseases can be identified by observing leaf color (Nguy-Robertson et al., 2015; Arnal Barbedo, 2013). Despite its merits, the human vision detection method is subjective and user-dependent, which may render inaccurate results. To ameliorate this, objective quantification of colors through empirical models of the color space is required (Woolf et al., 2021).

Recently, there is a rising interest in quantitative color measurements of images obtained from low-cost commercial technologies such as smartphones, scanners, and digital cameras for scientific applications (Alizadeh et al., 2019; Wang et al., 2020; Weston et al., 2020a, 2021; Choi et al., 2015). The development of inexpensive imaging technologies has made objective colorimetric detection more feasible during the image acquisition stage. The assurance of consistent lighting conditions has also been attempted by fabricating a light box equipped with a

camera or a smartphone stand (Chung et al., 2015; Kim et al., 2017; Chen et al., 2019). Although considerable efforts have been made to ensure consistent image acquisition, less attention has been given to improving the image processing workflow, particularly to reducing the processing time. This is because developing tailor-made software, plug-ins, and applications for image analysis requires advanced programming knowledge and/or expertise in mathematics, pattern recognition, and color spaces.

Color analysis software commonly used by researchers includes ImageJ, Adobe Photoshop, MATLAB, Pantone Studio, Digital Colorimeter by Apple, and others customized by the developers. It is worth noting that, depending on the application, some of these software packages may not be suitable for color analysis. For example, if the user requires a color "picker" feature that uses only a single pixel to extract the color information, then Digital Colorimeter and Pantone Studio may be suitable. However, if the color analysis covers an area containing multiple pixels, then ImageJ, Adobe Photoshop, or MATLAB would be the most appropriate. Currently, a very limited number of software packages is available for area-based analysis as the development of this feature is time consuming and requires programming skills and a deep understanding of image analysis. ImageJ (Schneider et al., 2012; Parolo et al., 2020) and Adobe Photoshop (Choi et al., 2015; Jia et al., 2015) are the two most common programs used to extract area-based color information from images as they require minimum programming skills compared to MATLAB. ImageJ (https://imagej.nih.gov/ij/docs/intro.html) by the National Institutes of Health (NIH) is free and exists as a web application (web app) or downloadable software. ImageJ's strength is its extensibility through plug-ins written in languages other than its native Java, including R, Python, MATLAB, and others (Rueden et al., 2017). In contrast, Adobe Photoshop requires a paid subscription, though it is usually made available for researchers from their research institute. Despite their robustness, these programs require long processing times from color extraction to data saving, approximately 12 min for ImageJ and 24 min for Adobe Photoshop (Parker et al., 2020). Rueden and colleagues rewrote ImageJ's open-source codebase to develop the extended version called ImageJ2 to expand its application, such as enabling the analysis of highly dimensional datasets (Rueden et al., 2017). Woolf et al. developed a "Crop-and-Go" protocol to override ImageJ that enables masking and automatic region of interest (ROI) detection (Woolf et al., 2021). Crop-and-Go, in particular, is unique as it can isolate irregular shapes by adjusting its masking threshold. However, as both programs are based on ImageJ's original architecture, the user interface designs are not user-friendly, particularly for those who are not well-versed in design-related phrases (threshold, masking, contours, presets, etc.).

Troscianko and Stevens (2015) developed an ImageJ-embedded toolbox, or plug-ins, to process RAW files from images. Plug-in performance was assessed based on its ability to generate a linear response per image across various exposure times. Performance validation was done by comparing values generated from a spectrometer. However, like almost all the plug-ins designed for ImageJ, the toolbox was designed for use in biology and life sciences research (Arena et al., 2017). The only tool that is not embedded in ImageJ and exists as free downloadable software is ColorScan (Parker et al., 2020). ColorScan contains useful features such as preset saving, histogram plots, zone image export, refining tools (masking, shape and size tolerance, find contours), and others. However, although the authors claimed that color extraction only requires 2 min, the full user guide showed that there are many steps required to start the program. Furthermore, Python and OpenCV installation are required, the program script must be downloaded from GitHub, and the user interface (wording and design) is not intuitive.

A simple tool that enables rapid color extraction and analysis is valuable for colorimetric sensors, which have been widely used in various applications, such as to monitor environmental conditions (Dolai et al., 2017), food quality (Weston et al., 2020b, 2020c, 2022), contaminants (Mazur et al., 2020), and diagnose diseases (Mazzone et al., 2012, 2015; Weston et al., 2020d). In colorimetric sensors, the presence and concentration of target analytes are measured by the degree of color change. The quality of the results does not only depend on the change in hue and intensity, but also on the way the users sample the sensors' area. The color sampling method is imperative particularly when there is an uneven color change across the sensing area, as an inconsistent method leads to greater variability. In addition, it is essential that sampling is done by averaging the color values from the numerous pixels instead of selecting a single pixel in the sensor's ROI. For example, Fig. 1a shows the variable RGB values when a single pixel is sampled from different areas in the same sensor with uneven color change. Variability is also present even when the sensor appeared to have similar color distribution to the eye (Fig. 1b). Both cases demonstrated the importance of averaging the ROI for objective measurements, particularly for sensors that have uneven color distribution.

Fig. 1. Variability in RGB values when selecting a single pixel or by averaging an area in colorimetric sensors which has (a) uneven and (b) even color change. The area selected is bound by the neon green circle, whereas the single pixel selected is represented as the red dot. RGB values are presented as [R, G, B]. (For interpretation of the references to color in this figure legend, the reader is referred to the Web version of this article.)

More recently, smartphone applications or apps such as Color Picker (Alizadeh et al., 2019; Wang et al., 2020), Color Name (He et al., 2019), Color Grab (Xiao et al., 2019), and ColorLab (Hosu et al., 2019) have been explored for color quantification to assist results interpretation where the detection test is being conducted (on-site). Whilst quick results can be generated, these apps report values based on a single selected pixel instead of averaged values of the selected area that contains many pixels, which means that results are subjective.

Although considerable progress has been made to streamline the color analysis workflow, there is currently no free web application (web app) that can rapidly extract colorimetric data intuitively. In this paper, we present a free and intuitively designed web application called Trigit (http://trigit.com.au) to overcome the limitations of the existing tools. Trigit can rapidly process images, automatically identify output zones regardless of their size and shape, and eliminate requirements for time consuming and user-dependent image processing capabilities. The motivation behind Trigit is to re-engineer ImageJ's RGB measure tool by improving its workflow to meet the scientific demands whilst maintaining accessibility to non-specialized users and experts in the field. Furthermore, we selected a web app rather than downloadable software to avoid issues with installation and device storage space.

2. Methods

2.1. Trigit user interface (UI)

2.1.1. Web app builder

Our primary goal in developing Trigit was to make the user experience as frictionless as possible. To this end, we decided to create a web app (website), as opposed to a traditional desktop app. A web app allows new users to get started with Trigit in a matter of seconds, simply by navigating to the site in their web browser. A desktop app, on the other


hand, requires installation on each user's computer. This process is time consuming and may be difficult or impossible for users with computers managed by an organization, who may not have permission to install applications without administrator approval.

Developing a web app additionally allows us to take advantage of HTML5 – a set of powerful technologies which greatly simplify the task of creating an intuitive and modern-feeling UI. This contrasts with desktop applications written in Java or Python, which often use the Swing and Tkinter UI frameworks respectively. These frameworks produce dated-looking UIs and require considerable effort to achieve functionality which would be simple to implement using HTML5.

Despite being a web app, the architecture of Trigit is more similar to a traditional desktop app wherein all image processing is performed in the user's browser, rather than on a web server. Crucially, this means that images imported into Trigit never leave the user's computer – a very important consideration for confidentiality-conscious researchers. As an additional benefit, this architecture enables us to provide Trigit to our users for free.

To perform the required sophisticated image processing operations on (potentially) very large images at reasonable speeds, we utilize OpenCV.js. OpenCV is a powerful, open source image processing library, and is a standard choice for an application such as this. OpenCV.js is a JavaScript binding for a select subset of OpenCV functions, allowing these functions to be utilized inside a web app. OpenCV.js is created by compiling OpenCV to WebAssembly, which allows the OpenCV.js functions to run at comparable speeds to their OpenCV counterparts. Note that, traditionally, code written for web apps had to be written in JavaScript, a considerably slower language than lower-level languages such as C/C++, which can be used within desktop apps.

Primarily for ease of development, Trigit was written using the React UI framework, along with the TypeScript language. These choices have no impact on the end user. However, both technologies are industry standards for web development, making it easy for other interested members of the research community to either contribute to, or build on top of, Trigit's open-source codebase.

2.2. Performance evaluation

To demonstrate Trigit's suitability to replace the currently most used color analysis tool (ImageJ), Trigit's performance was evaluated in various aspects listed in Table 1. The performance indicators include the difference of values generated, in absolute value or percentage difference. We also compared Trigit's processing time against ImageJ. Images with an alpha channel (transparent background) can still be analyzed. However, the transparent field will appear as black and will be treated as a black surface (i.e. will show RGB [0,0,0]) as shown in Fig. S1 (Supporting Information).

Table 1
Evaluation points and their variable parameters used to assess Trigit's suitability for image analysis.

Evaluation point | Variable parameters
Ability to process different formats and image resolution | Image format (PNG, JPEG, TIFF); 75–1200 dpi
Cross platform operation capability in various operating systems (OS) and web browser type | OS include Windows and Mac OS. Browser types are Google Chrome, Microsoft Edge, and Safari
Processing time compared to ImageJ | Tools used (Trigit versus ImageJ)

2.3. Colorimetric polydiacetylene-based paper sensors

We tested the application of Trigit to extract color from colorimetric paper sensors with various colors. Colorimetric sensor materials that are widely used include gold nanoparticles (Mazur et al., 2020; Li et al., 2017), metal oxides (Alizadeh et al., 2019; Li et al., 2019), metalloporphyrins (Mazzone et al., 2012), and polydiacetylenes (Park et al., 2018; Phonchai et al., 2019). In this study, we used polydiacetylene-based sensors due to their ease of fabrication and ability to generate a multitude of colors. Polydiacetylenes have also been developed for many applications such as disease diagnosis (de Oliveira et al., 2015; Doerflinger et al., 2019), environmental monitoring (Cho and Jung, 2018), and food quality monitoring (Weston et al., 2022).

2.3.1. Materials

10,12-Pentacosadiynoic acid (PCDA; >97%) and 10,12-Tricosadiynoic acid (TCDA; >98%) were purchased from Sigma Aldrich. 4,6-Heptadecadiynoic acid (HPDA; >97%) was purchased from FUJIFILM Wako Chemicals. Whatman® Grade 1 filter paper (nominal thickness, 180 μm; typical particle retention, 11 μm; material, cellulose) was obtained from GE Healthcare Life Sciences. Absolute ethanol was obtained from ChemSupply. Chloroform was obtained from VWR Chemicals. Ultrapure water (18.2 MΩ cm−1 resistance) was provided by arium® pro Ultrapure Water Systems (Sartorius). Chemicals were used as received without further purification.

2.3.2. Fabrication of polydiacetylene paper sensors

UV polymerization of pristine unmodified diacetylene (DA) monomers generates blue color polydiacetylene (PDA), which then changes to red when exposed to stimuli, e.g., analytes, pH and temperature changes (Tjandra et al., 2021, 2022). During its transition, a purple color can be generated. When its molecular structure is modified, PDA can yield other colors (Yoo et al., 2018; Shim et al., 2017; Yoon et al., 2013; Park et al., 2013). Herein, we synthesized PDA with various blue saturations by varying the UV polymerization time. PDA of different hues were produced by chemically modifying its structure, from carboxylic acid to aminium iodide functional group.

2.3.2.1. PDA with different blue saturations. PCDA monomers were dissolved in absolute ethanol at a concentration of 50 mM. The solution was sonicated to dissolve the monomers and passed through a 0.45 μm PTFE filter to remove large aggregates. 3 μL of the PCDA solution was dropcasted onto a filter paper and air-dried for at least 10 min. The PCDA-containing paper was photopolymerized with UV (254 nm, UV lamp Spectro-UV ENF-260C/FA, 6 W) for 2, 5, or 10 s to yield blue PDA with various color saturations. The distance between the UV source and the paper was fixed at 4 cm.

2.3.2.2. PDA with different hues. In addition to blue, red, and purple PDA (derived from DA monomers with carboxylic acid groups), we also tested PDA with yellow, green, and pink hues (derived from DA monomers with iodide (IO) functional groups). Stock solutions of unmodified and modified DA were prepared in separate vials: 50 mM PCDA, 80 mM TCDA, 80 mM HPDA, 40.6 mM PCDA-IO, and 53.6 mM HPDA-IO. PCDA, TCDA, and HPDA monomers were dissolved in ethanol, whereas PCDA-IO and HPDA-IO monomers were dissolved in ethanol:chloroform (1:1 v/v). Stock solutions were then mixed in various molar ratios, dropcasted (3 μL) onto a filter paper, dried for at least 15 min, and UV-ed for 15 s. The composition and molar ratios of PDA with different hues are: H1 (HPDA/PCDA-IO 1:1), H2 (HPDA/HPDA-IO 1:1), H3 (HPDA/PCDA-IO 3:1), H4 (TCDA/PCDA-IO 1:1), P1 (TCDA/PCDA-IO 2:1), P2 (PCDA/PCDA-IO 2:1) and P3 (TCDA/PCDA-IO 1:1). For H1, H2, H3, P2, and P3, the sensors were left overnight under ambient light. Red H4 was generated by exposing the paper sensor to chloroform. Purplish-pink P1 was generated by exposing the paper sensor to moisture.

2.3.3. Image acquisition

The paper sensors were scanned on a flatbed scanner (Epson Perfection V39). Scanner settings were photo mode with various ranges of dpi (75, 150, 300, 600, 1200 dpi), color management was color sync, and target was sRGB. All files were scanned in PNG format unless


specified.

2.3.4. Comparison with ImageJ

ImageJ 1.53e (downloadable version) was used. The ROI was manually selected, and values were generated using the "RGB Measure" plugin (Plugins > Analyze > RGB Measure). Results generated in the pop-up window were manually copied and pasted into a Microsoft Excel spreadsheet. For CIELAB and HSV color spaces, we used the downloadable plugin Color Transformer (https://imagej.nih.gov/ij/plugins/color-transforms.html) to convert the RGB image to CIELAB or HSV. This splits the RGB image into 3 channels, for example RGB to L, A, and B, or RGB to H, S, and V. The ROI was manually selected and measured (Analyze > Measure) and repeated for each channel that appears when the window is scrolled. Results were manually copied and pasted to Microsoft Excel.

3. Results & discussions

The properties of an image, whether physical print or digital, are defined by several parameters such as pixels, resolution, color depth, and color spaces. Understanding these properties is crucial to know their effects on the quality of the image where color extraction is performed.

3.1. Image pixels, resolution, color depth, and color spaces

CIE 1931 XYZ by the International Commission on Illumination (CIE) was the first color space developed to reproduce human color vision based on a color matching experiment using human subjects. X, Y, and Z refer to the human tristimulus functions wherein X and Z represent the chromaticity functions, and the Y axis represents the luminance (Ibraheem et al., 2012). Though less applicable now, CIE XYZ formed the basis for most color spaces to date.

Fig. 2. (a) Color spaces and their models. (b) Definition of a pixel from an image viewed on a digital monitor. Color in a pixel is a combination of 3 primary colors which are RGB. However, when images are printed, the color space needs to be replaced from RGB to CMYK to ensure accurate translation on commercial printers. (c) Difference of the number of possible colors displayed among a full color, greyscale, and black and white image, assuming 8-bit per channel for both full color and greyscale image. (d) Image quality can be defined by its resolution, commonly measured as dots-per-inch (dpi). The higher the dpi, the sharper the image looks. (For interpretation of the references to color in this figure legend, the reader is referred to the Web version of this article.)

Since the discovery of CIE XYZ, many other color spaces have been developed, though not many are commonly used in the present. Fig. 2a


depicts the most common color spaces and their representation models. Table 2 details their type, dependence on a device, and applications. The main differences among these color spaces are primarily the inclusion of either or both chromatic signals and luminance/brightness in an axis, and their dependency on a device. The CIE color spaces (CIELAB or LAB, CIELUV, CIEXYZ) and HSV (Hue, Saturation, Value) account for both factors and separate the chroma and luminance in different axes, whereas the RGB and CMYK (Cyan, Magenta, Yellow, Key/Black) color spaces use the combination (whether by addition or subtraction) of each primary color in various luminance to yield a certain color.

Table 2
Common color spaces, model types, device dependencies, and examples of their applications.

Color space | Model type | Device dependent | Applications
RGB (R: Red, G: Green, B: Blue) | Additive | Yes | Digital graphics, image processing/analysis/storage
CMYK (C: Cyan, M: Magenta, Y: Yellow, K: Black) | Subtractive | Yes | Physical prints
HSV (H: Hue, S: Saturation, V: Value) | Cylindrical | Yes | Digital graphics design and processing, computer vision, image analysis, human visual perception, image editing software, video editor
YUV (Y: luminance; U, V: chromaticity) | Bicone | Yes | Digital video, video systems
CIELAB (L: Lightness; A, B: Chroma and hue) | – | No | Color matching system, advertising, evaluation of color difference, graphic arts, digitized or animated images, multimedia

As shown in Fig. 2a, the HSV color model represents color hue using an angular dimension, value or lightness with the y-axis, and saturation or depth of color with the cylinder radius. A higher H value corresponds to more yellow and a lower H means bluer. CIELAB expresses color using lightness (L), which is similar to the V in HSV but with a different scale, and A (a*) and B (b*) indicate the amount of green-red and blue-yellow, respectively. YUV is used in digital and video systems and was created to digitally encode color information to suit video and image compression and transmission formats. sRGB (standard RGB) is an additive color mixing space and was created for use on the internet or computer viewing as phosphors in digital monitors emit RGB to form a pixel. In contrast, CMYK is a subtractive color space primarily used in the printing industries and has fewer possible colors compared to RGB. Later versions of RGB such as Adobe RGB and Adobe Wide Gamut RGB were developed to ensure perceivable colors on the digital display can be reproduced using CMYK printers. The colors we perceive in physical prints are made up of numerous small dots with various compositions and sizes of cyan, magenta, yellow and black dots overlaid on top of each other (Fig. 2b).

In digital image viewing, each image we see on a computer screen is a combination of thousands of pixels containing one or multiple color channels. A greyscale image only contains 1 channel, which varies light intensity to display shades of grey. On the other hand, a full color image is comprised of the 3 color channels, which blend to create other colors perceived by the eye as shown in Fig. 2b. The number of possible colors each channel can display depends on the bit number. For example, a 1-bit channel shows only 2 colors (2^1), either black or white, whereas an 8-bit channel can show 256 (2^8). Each color channel typically contains 8 bits. Thus, a full color image containing 3 color channels (RGB) is a 24-bit image and can display over 16 million colors (2^(8x3)). A greyscale image should not be confused with a black and white image. The latter is a binary image, capable of only showing 2 colors, whereas a greyscale image can show multiple colors depending on the bit value (Fig. 2c). Finally, image quality is also defined by its resolution, typically measured in pixels-per-inch (ppi) in digital images or dots-per-inch (dpi) in printed images. Images with high dpi appear sharper as shown in Fig. 2d.

Despite the numerous color spaces available, all of them are correlated through mathematical relationships and are interchangeable to some extent. Some are not fully interchangeable when viewing digitally as they are dependent on the device used to view the image (Table 2). For example, if an image that has a 24-bit color depth is viewed on a computer screen that has a 16-bit color depth, then the audience will not be able to view the true color of the image. However, it is worth noting that although images with high dpi and bit depth can store the highest number of colors, they also require larger storage space. Hence, a compromise needs to be made during image generation to ensure color information is preserved for its intended purpose.

3.2. Overview of Trigit web app

To better design Trigit, we identified several features that meet the needs of potential users, including:

✓ Free and open-source tool to enable widespread use. It is imperative in the scientific community as it ensures transparency, extensibility, and reproducibility (Swedlow and Eliceiri, 2009).
✓ Compatible with common operating systems (OS). As Trigit is a web app, its operation is not dependent on OS types. This is beneficial as problems associated with ImageJ crashing in Mac OS are well-known (ImageJ, 2022; Apple Developer, 2022).
✓ Compatible with common web browsers, including Google Chrome, Microsoft Edge, and Safari.
✓ Data processing in various color spaces. Different color spaces are used for different applications. However, the most common spaces in analytical chemistry are RGB, HSV, and CIELAB (Eaidkong et al., 2012; Pumtang et al., 2011; Cantrell et al., 2010; Sharifzadeh et al., 2014). To enable translation for printing and web viewing purposes, we also included HEX (or hexadecimal) color codes and the CMYK model (a conversion sketch is given after Table 3).
✓ Robust. Trigit can process various shapes, image types, and quality (bit, dpi, color spaces of source file).
✓ Automatic data formatting for Microsoft Excel. Users can directly copy and paste the displayed data into Microsoft Excel. Each datum is pasted in a single cell, so no further formatting is necessary.
✓ Smartphone operation. To enable on-site measurements, Trigit can also be accessed using smartphones (Fig. S2, Supporting Information).
✓ Access without internet. Trigit can be saved onto a portable hard drive so it can be used in areas with no internet (Fig. S3, Supporting Information).

Table 3
Summary of Trigit web app features and compatibilities.

Features | Included
Image formats | PNG, JPEG, TIFF
Color spaces and range values | RGB (0–255); CMYK (0–100%); CIELAB (0–100% for L, −110 to 110 for A and B; Illuminant D50, 2°); HSV (H: 0–360, S: 0–255, V: 0–255); HEX (00–FF in hexadecimal, or 0–255 in decimal)
Image resolution | Range evaluated in this paper is 75–1200 dpi
Compatibility | Independent of OS. Web browsers that we have evaluated include Google Chrome, Microsoft Edge, and Safari
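To make the color-space outputs listed in Table 3 concrete, the short TypeScript sketch below (TypeScript being the language Trigit is written in) re-expresses an averaged RGB triple as HEX and CMYK values using the standard conversion formulas. It is an illustrative snippet under our own function names, not code taken from Trigit.

```typescript
// Illustrative conversions from an averaged RGB triple (0-255 per channel)
// to the HEX and CMYK representations listed in Table 3.
// Standard textbook formulas; not Trigit's actual source code.

interface RGB { r: number; g: number; b: number; }

// HEX: each 8-bit channel rendered as two hexadecimal digits (00-FF).
function rgbToHex({ r, g, b }: RGB): string {
  const toHex = (v: number) => Math.round(v).toString(16).padStart(2, "0").toUpperCase();
  return `#${toHex(r)}${toHex(g)}${toHex(b)}`;
}

// CMYK: subtractive model, reported here in percent (0-100%) as in Table 3.
function rgbToCmyk({ r, g, b }: RGB): { c: number; m: number; y: number; k: number } {
  const rp = r / 255, gp = g / 255, bp = b / 255;
  const k = 1 - Math.max(rp, gp, bp);
  if (k === 1) return { c: 0, m: 0, y: 0, k: 100 };          // pure black
  const scale = (v: number) => ((1 - v - k) / (1 - k)) * 100;
  return { c: scale(rp), m: scale(gp), y: scale(bp), k: k * 100 };
}

// Example: full red [255, 0, 0], as used in the calibration image (Fig. 6).
console.log(rgbToHex({ r: 255, g: 0, b: 0 }));   // "#FF0000"
console.log(rgbToCmyk({ r: 255, g: 0, b: 0 }));  // { c: 0, m: 100, y: 100, k: 0 }
```

The CIELAB values reported by Trigit additionally depend on the assumed illuminant (D50, Table 3); that conversion goes through CIE XYZ and is omitted here.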

Based on the identified needs above, Trigit is designed to contain the features in Table 3.

3.2.1. UI design

Trigit was designed to display a simple and intuitive user interface to streamline the color extraction process and enable use by non-specialized users. Fig. 3 shows the user interface of Trigit and the description of each button. In addition, a tool tip to guide users is shown when each button is hovered over. The Trigit workflow involves four main steps: (i) image upload, (ii) determination of selection tool, (iii) selection of ROI, and (iv) data management. After each result or row is generated, the whole results table is automatically copied to the clipboard. This data is also automatically formatted such that it is readily pasted into a preferred data storage method, such as a table or Microsoft Excel. Fig. 3, Tool 5 (Copy) was added in case the user accidentally copied another item in a different window and lost the data that was originally automatically copied into the clipboard. We also included an "Export" button (Fig. 3, Tool 6) so that users can export or download the images shown in the bottom panel as PNG files. A user guide video is available in the Supporting Information.

Fig. 3. Trigit user interface and short descriptions of each tool.

Trigit can also be operated on smartphones simply by accessing the website (Fig. S2, Supporting Information). Trigit smartphone use is


currently only limited to Android users because Apple's open-source browser engine, called WebKit, significantly limits feature development.

We understand that research data is confidential in nature. Although users need to upload their images for processing, this web app only processes data in real time and does not store data in any way. Data will be cleared and non-retrievable once the page is refreshed. Thus, we recommend that processed data is copied to a personal inventory immediately.

3.2.2. 4-step color extraction process

Unless specified, all data generated was processed on Google Chrome operated on Windows 10 Pro. Analysis using Trigit only took <5 s for one spot. Trigit also automatically formats results in a table that can be copied and pasted into Excel directly. Cropped images of each output zone can be saved as PNG if desired. We compared Trigit's performance with ImageJ. We were unable to select the exact number of pixels (area) used in ImageJ measurements. However, values of the "area" displayed in ImageJ's results window were comparable to the number of pixels or "pixel" in Trigit. This discrepancy would not affect Trigit performance provided a similar area is selected. It is worth noting that, similar to ImageJ, Tool 11 (Fig. 3) in Trigit also allows users to consistently analyze a similar sized area and number of pixels in each region of interest. Fig. 4 depicts the simple 4-step protocol to extract colors using Trigit. In addition, if analysis of multiple images is required for data comparison, Trigit can be opened in a new browser tab or window. This is similar to ImageJ's File > Open > Select image steps.

Fig. 4. Trigit workflow includes 4 main steps: (1) image upload, (2) determination of selection tool, (3) selection of ROI, and (4) data management.

3.2.3. Autoselection tool (sensitivity formula)

A useful feature of Trigit is an autoselection tool with a tunable sensitivity percentage (Fig. 3, Tool 9), which is implemented using OpenCV's 'floodFill' function (OpenCV Miscellaneous Image Transformations). This tool is useful when selecting an area with a unique shape but similar hues. The autoselection tool includes the neighbouring pixels from the selected pixel (single pixel) only if the color value is within the sensitivity range defined by the user. Furthermore, it is also worth noting that the autoselection tool only includes areas with similar hues if there is no space or region with significantly different color values. Of note, a similar task can be performed in other software, such as Adobe Photoshop using their masking and/or Select Color Range command. However, these features are hidden and are difficult to use, even for trained users. In addition, it requires users to isolate each sensor area in a separate image before any processing can occur.

Fig. 5 details the way the autoselection tool works. Equations (1)–(4) were used to determine the included pixels. The following equations are shown based on the R color value. The selected pixel is defined as the single pixel selected by the user at the (x,y) coordinate within the image (Fig. 5a). For a pixel to be selected, all the R, G, and B values of the neighbouring pixel (Fig. 5b) need to be within the acceptable value set by the sensitivity percentage. After the first iteration, the included pixels based on the selected pixel are shown in Fig. 5c. This process is repeated such that the region of selected pixels expands gradually from the initial pixel, until no more adjacent pixels are within the prescribed color range. In this particular example, when a sensitivity value of 97% was set, the final region selected by Trigit is shown in Fig. 5d.

limit value = 255 − (sensitivity percentage (%) × 255)   Equation (1)

upper limit_R = selected pixel_R + limit value   Equation (2)


Fig. 5. Autoselection tool working mechanism. (a) The user first selects a pixel whose RGB value will be used as a reference. The floodfill function then calculates the limit range of each RGB depending on the user-defined sensitivity percentage. (b) The RGB values of the neighbouring pixels are evaluated and the range values for each RGB are determined. (c) At the final stage, the function selects the neighbouring pixel only if all RGB values fall within the specific range. (d) The final area selected by Trigit is highlighted within the neon green boundary box. (For interpretation of the references to color in this figure legend, the reader is referred to the Web version of this article.)

lower limit_R = selected pixel_R − limit value   Equation (3)

included pixel_R = pixels within [lower limit_R ≤ selected pixel_R ≤ upper limit_R]   Equation (4)

Thus, based on Equation (4), a sensitivity value of 100% will only select neighbouring pixels with exactly the same R, G, and B values. It is very rare for images generated by scanning or photography to have a significant area with pixels of identical colors; this is, however, common in computer-generated images (CGIs).

3.3. Color calibration test and validation against ImageJ

We first tested the accuracy of Trigit to generate each primary color in the RGB and CMYK color spaces. We created Fig. 6 using Adobe Illustrator and each circle is set at 100% or the highest value possible in the color space (e.g. 100% Cyan for full cyan in CMYK or [255,0,0] in RGB for full red). The generated values were compared with those from ImageJ. ImageJ is unable to generate the CMYK color space. As CMYK is a subtractive color model (Fig. 2a), each color when converted to RGB color space is C [0,255,255], M [255,0,255], Y [255,255,0], and K [0,0,0]. Table S1 (Supporting Information) shows no difference in values generated from Trigit and ImageJ (RGB analysis) using the circle and rectangle selection tool.

It is worth noting that when the autoselection tool is used, a slight difference will be observed as shown in Table S1 (Supporting Information, autoselect 50% data). This is because Fig. 6 is a computer-generated image (CGI) and not produced by photography or a scanner. All CGIs have an

Fig. 6. Reference image used for calibration created using Adobe Illustrator. Top row from left to right is 100% Cyan, 100% Magenta, 100% Yellow, and 100% Black. The bottom row from left to right is full red (255,0,0), full green (0,255,0), and full blue (0,0,255). CMYK in RGB color space is C [0,255,255], M [255,0,255], Y [255,255,0], and K [0,0,0]. (For interpretation of the references to color in this figure legend, the reader is referred to the Web version of this article.)

anti-aliasing applied. Anti-aliasing is applied to smooth the jagged edges of a shape by averaging the colors of the pixels at the boundary region (Jiang et al., 2014). Fig. 7a illustrates the significance of anti-aliasing a digitally drawn circle. As can be seen, the edge of the anti-aliased circle contains numerous pixels with smoother colors relative to the main circle.

To ensure only the black region is selected, the sensitivity bar of the autoselection tool should be adjusted to a higher value. A higher sensitivity value means that for a pixel to be selected, the difference between the adjacent pixels must be minimized based on Equations (1)–(4). To evaluate the effects of adjusting the sensitivity bar, we used the 100% Black or RGB [0,0,0] area in the reference image (Fig. 6). The generated results are shown in Table 4. Higher sensitivity reduced the number of pixels selected as the allowable pixel value difference is smaller. Note that for 0% sensitivity, all the pixels in the image are selected and the resultant averaged color grey was generated (Fig. 7b, 0% sensitivity). The enlarged images showing the edges of the circles at various sensitivities are shown in Fig. 7b.

3.4. Effects of web browser and operating system

To ensure Trigit can be used regardless of the device and browser, we evaluated Trigit performance on common operating systems and web browsers. These include Mac OS (Mojave, version 10.14.5) with the Safari web browser, Windows 11 Pro with Google Chrome, and Windows 11 Pro with Microsoft Edge. The selection tool was autoselection (100% sensitivity). The image analyzed was the 100% Cyan circle in the calibration image (Fig. 6). As can be seen in Table 5, Trigit can be used across various OS and web browsers without affecting the results.

3.5. Effects of image format, resolution, and sensor hue and saturation

Images can be created in different formats and resolutions depending on their applications. Image resolution, measured in dpi, describes the amount of detail in an image. Resolution and color are related to each other by pixels, which can be considered as the smallest building blocks of an image. Each pixel can store a specific number (or bit) of colors, also referred to as color depth/bit depth. A higher resolution image is bigger in size and storage memory as it contains more pixels. When a high dpi image is compressed to a low dpi image whilst maintaining the physical image size, the neighbouring pixels will average their colors to make a new pixel, ultimately reducing the total number of pixels. For example, standard screen resolution is 72 dpi, whereas higher resolutions such as 300 dpi are usually required for prints (magazines, banners, posters, etc.). If a 72-dpi image is used for a print whose physical size is larger than the 72-dpi image, the print will appear pixelated and of poor quality. Hence, it is crucial to set image parameters with a clear knowledge of their end-use.

The three most common image formats in the scientific community are PNG, JPEG, or TIFF. Each varies depending on the level of compression, which affects the image quality. PNG and TIFF are lossless formats, which contain the greatest level of quality. TIFF retains the highest amount of detail and is preferred for printing and publishing. However, it is the least used for web viewing owing to its generally large image size, which extends loading time. PNG is preferred when a transparent background is required. On the other hand, JPEG is a lossy compression format and is preferred for device or web viewing as it has a smaller image size and thus loads faster. It is worth noting that when a PNG/TIFF file is converted to JPEG, some details will be lost.

Herein, we evaluate the effects of scanning the same sensor image in various file formats (PNG, JPEG, TIFF) and resolutions (75, 150, 300, 600, 1200 dpi) on color values and the number of pixels. Of note, Trigit can also process images captured by smartphones or any cameras as long as the image format is PNG, JPEG, or TIFF. We also evaluate the capability of Trigit's autoselection tool to select sensors with various hues and saturation levels, with multiple or single-color shades in a spot.

Fig. 8 shows the sensor image that we produced for each evaluation. Results in Table 6 show that scanning resolution affected the number of pixels in each area, whereby a lower resolution image has a lower number of pixels as expected. However, the resolution did not affect the image's color values as no difference in each R, G, and B value was found. These results are expected as higher resolution images resolve the reduction of the number of pixels by averaging the neighbouring pixels (Parker et al., 2020). Although lower dpi images are preferred due to the smaller file size, resolutions above 300 dpi are usually recommended to ensure sufficient details are preserved, especially for images that have multiple colors and outlines. 300 dpi is also usually the minimum dpi requested for figures submitted to journal publications (Nature, 2022; American Chemical Society, 2022; Wiley, 2022). When analyzing images of different formats, results in Table 6 show that image format did not vary color values despite having different pixel numbers. There is currently no consensus on image format for scanning in the scientific community. However, PNG is usually selected to retain reasonable image quality without overloading storage memory.

We evaluated the performance of the autoselection tool by varying the sensitivity (SST) values. Spots S1 – S3 in Fig. 8a were used to evaluate the sensitivity tool against spots with similar hues (blue) but varying saturation levels. Fig. 8d was used to evaluate its performance when used on other hues. Fig. 8e was used to demonstrate the ability of the autoselection tool to selectively exclude areas with colors different to that of the pixel selected. Tables S2–S5 (Supporting Information) list the colorimetric values and Fig. 9 depicts the area selection as shown in the Trigit bottom panel. The center of each spot is selected, except for Fig. 8e where both middle and edge pixels were selected as multiple colors can be visibly seen within the spot. The selected area used for calculation in Trigit is bound by the green border. Note that SST 100% works as a single pixel color picker. Hence, this would only show the color of a single pixel unless all neighbouring pixels have identical values. This is excluded from our evaluation as it is useful only for CGIs or vector images with even color.

Fig. 9a shows that the Trigit autoselection tool at 90% and 85% SST was able to select the area accurately regardless of the saturation level (S1 – S3) and hue (H1 – H4). We also highlight that Trigit could detect abnormal regions within H2 and automatically exclude these from the averaged values. The effect of sensitivity on selection area is evident when used against sensors with low saturation (faded) colors. For example, at 80% SST, both H1 and S1 cannot be detected, and H2 and S2 were undetected at 70% SST. We also evaluated autoselection SST ranges against sensors with partial (uneven) color change. Fig. 9b and c show the selection area when the center and edge of the sensor picture were selected, respectively. 90% SST excluded a significant area from the selection. Notably, the central region in P2 was distinguished effectively (Fig. 9b), as was the edge region (Fig. 9c, P2e). However, in


Fig. 7. (a) Anti-aliased and non anti-aliased edges on digitally drawn circles. Anti-aliasing creates grey pixels to smooth the edges of a circle. (b) Effects of altering
sensitivity percentage on autoselection tool when selecting 100% Black circle from the reference image. Higher sensitivity lowers the boundary region so that grey
pixels used for anti-aliasing are not selected.
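As a worked illustration of Equation (1): at 97% sensitivity the limit value is 255 − 0.97 × 255 ≈ 7.7, so a neighbouring pixel is included only if each of its R, G, and B values lies within roughly ±8 of the selected pixel's values. At 75% sensitivity the window widens to about ±64, and at 25% to about ±191, which is why progressively more of the grey anti-aliased edge pixels in Fig. 7b are captured, and the selected pixel count grows, as the sensitivity is lowered (Table 4).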

Table 4
RGB values generated using the autoselection tool set at different sensitivity values. Region selected was 100% Black in the reference image (Fig. 6). Higher value means the difference between each RGB value must be minimum for the pixel to be selected. Standard deviation error between 4 repeats is presented (n = 4).

Details: 100% Black, autoselection tool
Sensitivity (%) | Mean R | Mean G | Mean B | Stdev R | Stdev G | Stdev B | Pixels
0   | 193 | 193 | 193 | 0 | 0 | 0 | 452,864
25  | 2   | 2   | 2   | 0 | 0 | 0 | 26,760
50  | 1   | 1   | 1   | 0 | 0 | 0 | 26,627
75  | 0   | 0   | 0   | 0 | 0 | 0 | 26,513
100 | 0   | 0   | 0   | 0 | 0 | 0 | 26,330

P1 and P3, where the separation rings are either faded or not clear, Trigit was unable to identify the difference between central and edge selection. In such cases, if the central region is of interest, we recommend using the circle selection tool. It is worth noting that the Trigit autoselection tool can select a region with similar color only if the area is connected and has no space in between with significantly different colors. Such objects with spaces in between typically require a masking function such as that in Adobe Photoshop. The masking function is beyond the scope of this study.
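To illustrate how the sensitivity percentage of Equations (1)–(4) can drive this kind of region growing, the TypeScript sketch below re-implements the selection logic directly over an RGBA ImageData buffer. It is a minimal, illustrative re-implementation under our own naming; Trigit itself delegates this step to OpenCV.js's floodFill, as noted in Section 3.2.3.

```typescript
// Minimal region-growing selection following Equations (1)-(4).
// Assumes an RGBA ImageData buffer (e.g., obtained from a <canvas> 2D context).
// Illustrative only; Trigit uses OpenCV.js floodFill for this step.

function autoSelect(img: ImageData, seedX: number, seedY: number, sensitivity: number): boolean[] {
  const { width, height, data } = img;
  const limit = 255 - (sensitivity / 100) * 255;               // Equation (1)
  const idx = (x: number, y: number) => (y * width + x) * 4;   // RGBA stride

  const seed = idx(seedX, seedY);
  const seedRGB = [data[seed], data[seed + 1], data[seed + 2]];

  // A pixel qualifies only if ALL of its R, G, B values fall within
  // [selected pixel - limit, selected pixel + limit] (Equations (2)-(4)).
  const withinLimit = (p: number) =>
    [0, 1, 2].every((c) => Math.abs(data[p + c] - seedRGB[c]) <= limit);

  const selected = new Array<boolean>(width * height).fill(false);
  const stack: Array<[number, number]> = [[seedX, seedY]];

  while (stack.length > 0) {
    const [x, y] = stack.pop()!;
    if (x < 0 || y < 0 || x >= width || y >= height) continue;
    const flat = y * width + x;
    if (selected[flat] || !withinLimit(idx(x, y))) continue;
    selected[flat] = true;
    // Grow into the 4-connected neighbours; the region stops expanding once
    // no adjacent pixel is within the prescribed color range.
    stack.push([x + 1, y], [x - 1, y], [x, y + 1], [x, y - 1]);
  }
  return selected; // mask of included pixels, whose RGB values are then averaged
}
```

Averaging the R, G, and B values over the true entries of the returned mask gives the mean color reported in the results table; because the region only grows through connected pixels, areas separated by a space of significantly different color are excluded, as described above.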


Table 5 and has recently been used in scientific research due to its efficient
Color values generated from Trigit using different web browsers and operating image segmentation and classification of color image data (Kurniastuti
systems. Standard deviation is represented as errors between 4 repeats. Number et al., 2022). CMYK is used in the printing industry, whereas hex is used
of pixels for all 4 repeats is 26,333 pixels (error 0%) using the autoselection tool for web viewing.
at 100% sensitivity. Analysis of non-RGB color spaces in ImageJ involves the conversion
OS Browser Mean Stdev of RGB image to the desired color space using a downloaded plugin. For
R G B R G B example, Color Transformer or Color Space Converter, then each of the R,
G and B channel produced is measured using Analyze > Measure, as
Windows 11 Pro Google Chrome 0 255 255 0 0 0
Microsoft Edge 0 255 255 0 0 0 done in prior studies (Parker et al., 2020; Logger et al., 2020). To convert
Mac OS Safari 0 255 255 0 0 0 image to HSV, CMYK or CIELAB color space, Color Transformer plugin
was used (Plugin > Analyze > Color Transformer > Set “To color space”
to either HSV, CMYK or Lab). From each image generated (i.e: 3 for HSV,
4 for CMYK and 3 for CIELAB), individual measurements were con­
ducted in each ROI using Analyze > Measure. Of note, ImageJ’s HSV
measurements are presented in a different scale (i.e: H, S and V in 0–1) to
that of Trigit (i.e: H is between 0 and 360◦ , S and V between 0 and
100%). Hence, all H values generated from ImageJ need to be multiplied
by 360, and S and V to be multiplied by 100. Tables S6–S9 (Supporting
Information) show the results generated using ImageJ and Trigit for
each color space RGB, CMYK, CIELAB and HSV. Measurements were
conducted on calibration image (Fig. 6). RGB, CMYK and HSV values
were identical. However, CIELAB showed slightly differing values
(1–27%). This is because ImageJ’s Color Transformer plugin used a
different illuminant (i.e: white reference point) to that of Trigit when
converting a RGB stack to LAB stack. ImageJ uses D65 whereas Trigit
uses D50. D50 is common in the printing industry and is recommended
by the International Color Consortium, whereas D65 is used for digital
viewing. D65 is more blue-biased, while D50 resembles a warm daylight
(Seymour, 2022).
We estimate Trigit’s total RGB color extraction time for 7 ROI is
15–20 s for trained users or up to 40 s for non-trained users, although the
Fig. 8. (a) Blue polydiacetylene sensor at various color saturation levels from
first use may require up to 1.5 min. Compared to ImageJ, RGB color
least S1 to most S3 saturated blue color. (b) Spot S3 was scanned at different
extraction is 4–9 times longer, that is ~90 s for regular users and up to 3
image resolutions (dpi) to evaluate the effects of resolution on color values. (c)
Spot S1 was scanned at different image formats to demonstrate Trigit capabil­
min for new users. The key benefit of Trigit compared to ImageJ is the
ities to process various formats. (d) Polydiacetylene sensors with different hues ability to extract other color spaces apart from RGB in a straightforward
with even colors across the whole spot area. (e) Polydiacetylene sensors with manner, without having to manually split each channel and conduct
partial (uneven) color change. (For interpretation of the references to color in measurements separately. For example, the time required to extract
this figure legend, the reader is referred to the Web version of this article.) CMYK in Trigit remains the same (i.e: 15–20 s for 7 ROI), whereas 10–15
times longer (i.e: 3–5 min) is required in ImageJ. Note that, non-RGB
3.6. Trigit color extraction compared to ImageJ color extraction in ImageJ is also only possible if the user has been
trained and/or has conducted intensive research on its method (i.e:
Our two main goals when developing Trigit are to ensure it is more video tutorial or articles).
rapid than ImageJ regardless of the color space, and to ensure it is
intuitive even for non-specialized and untrained users. ImageJ is rela­ 4. Conclusions
tively straightforward for measuring RGB (Select region > Plugins >
Analyze > RGB Measure). However, there is no direct way to obtain We have developed Trigit as a free, rapid and intuitive web app to
values from other color spaces. Fig. 10 contrasts the difference in steps extract color information to streamline the current image processing
involved to procure non-RGB values using ImageJ versus Trigit. analysis. Rather than relying on the highly subjective human vision,
Different color spaces may be preferred in specific fields. For example, extracting and quantifying color values from images enable objective
the food industry prefers CIELAB as color differences can be correlated measurement of color change. Human vision is unreliable especially
with human visual perception by calculating the Euclidian distance when color matching is required for analyte quantification. This is
(Sharifzadeh et al., 2014). HSV is common in graphic design industry because each person has slightly different numbers of photoreceptor
cells, which changes the way color is perceived. Objective color

Table 6
Color values generated from Trigit using images with different resolutions and formats. Standard deviation is represented as errors between 4 repeats.
Area Format Resolution (dpi) Mean Stdev px mean px stdev

R G B R G B

S3 PNG 75 170 198 229 0.4 0.2 0.1 299 32


150 170 198 229 0.2 0.1 0.1 1221 118
300 170 199 230 0.3 0.1 0.1 4958 605
600 169 199 230 0.3 0.1 0.0 20,006 2091
1200 169 199 231 0.1 0.0 0.0 79,173 3790
S1 PNG 300 210 237 247 0.1 0.0 0.0 8090 313
JPG 300 208 235 246 0.1 0.1 0.0 7686 315
TIFF 300 207 234 245 0.1 0.1 0.1 7155 549

11
A.D. Tjandra et al. Biosensors and Bioelectronics: X 14 (2023) 100361

Fig. 9. Trigit autoselection tool performance at various sensitivity percentages when tested against spots with (a) various hues and saturation and on spots with
partial uneven color change when selected (b) in the center and (c) on the edge of the spot. Grey box means no selection is possible (i.e: the whole image was selected
and generated values are invalid). Low sensitivity values are not recommended for colors with low saturation. Selected area is bound within the neon green lines. (For
interpretation of the references to color in this figure legend, the reader is referred to the Web version of this article.)

4. Conclusions

We have developed Trigit as a free, rapid, and intuitive web app that extracts color information and streamlines current image processing and analysis. Rather than relying on highly subjective human vision, extracting and quantifying color values from images enables objective measurement of color change. Human vision is unreliable, especially when color matching is required for analyte quantification, because each person has a slightly different number of photoreceptor cells, which changes the way color is perceived. Objective color measurement has been used and found useful in various scientific fields to ameliorate errors due to variable user interpretations of results. For example, measuring the color change of colorimetric sensors can inform users about the presence and concentration of target analytes. However, colorimetric sensors exist in variable and often complex test-zone geometries and colors (hues and saturations), which makes the color extraction process challenging.

Undeniably, there is a plethora of research focusing on improving the performance of colorimetric sensors. However, minimal attention is given to improving the data collection and processing stage, which takes a significant portion of researchers' time. This is important to address because researchers work with thousands of data points, and shortening the processing time could accelerate the progression of the scientific research itself. Of note, this web app is specifically designed for scientific audiences working on colorimetric sensors to accelerate the otherwise time-consuming color extraction process.

In this work, we have designed a standalone platform in the form of a web app, instead of conventional downloadable software or a plugin, to eliminate the need for installation prior to use. Although ImageJ has been widely used to extract color information, it is slow and limited when analyzing non-RGB color spaces. Herein, we presented Trigit, which can extract RGB values and non-RGB values (i.e., CMYK, HSV, CIELAB, HEX) 4–9 times and 10–15 times faster than ImageJ, respectively. Trigit is a web app, making it accessible anywhere and at any time without needing to download software beforehand, as long as there is an internet connection. In addition, being a web app means that Trigit's operation is independent of operating systems and web browsers. Trigit is equipped with an autoselection tool with a tunable sensitivity percentage to streamline automated area selection regardless of shape. The autoselection tool selectively picks neighbouring pixels whose color falls within the sensitivity range selected by the user. Color analysis in Trigit involves four main steps: (i) image upload, (ii) determination of the selection tool, (iii) selection of the ROI, and (iv) data export. The values generated are no different from those of ImageJ, which demonstrates Trigit's ability to replace ImageJ for color analysis. In cases where on-site analysis is required, Trigit can also be accessed using smartphones. Trigit can also be downloaded and stored on portable drives to enable use without internet access.
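For readers who want a rough offline analogue of this upload–select–export workflow, the hypothetical snippet below uses the Pillow library to average the RGB values over a rectangular ROI and writes the result to a CSV file that opens directly in Microsoft Excel; the file name and ROI coordinates are placeholders, and this is not Trigit's code.

```python
# Hypothetical offline analogue of the upload -> select ROI -> export workflow.
# Requires the Pillow imaging library (pip install Pillow).
import csv
from PIL import Image

def mean_rgb(path, box):
    """Mean R, G, B over a rectangular ROI given as (left, upper, right, lower)."""
    pixels = list(Image.open(path).convert("RGB").crop(box).getdata())
    n = len(pixels)
    return tuple(sum(channel) / n for channel in zip(*pixels))

roi = (120, 80, 180, 140)                   # placeholder ROI coordinates
r, g, b = mean_rgb("sensor_spot.png", roi)  # placeholder file name

with open("color_values.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["file", "ROI", "R", "G", "B"])
    writer.writerow(["sensor_spot.png", roi, round(r, 1), round(g, 1), round(b, 1)])
```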
This web app is useful for the scientific community that requires large-scale and rapid extraction of color information, for example, researchers who work on colorimetric sensors, where the color before and after analyte exposure needs to be measured. More importantly, the Trigit autoselection tool is useful for processing sensors with complex shapes, as it automatically isolates the area within a boundary region.

Author contributions

ADT conceptualized the work, developed methodologies, performed experiments, analyzed the data, and wrote and revised the manuscript. TH developed the program. RC supervised the project and revised the manuscript. The manuscript was written through the contributions of all authors. All authors have given approval to the final version of the manuscript.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.


Fig. 10. Comparison of the methods involved in extracting non-RGB color values in (a) Trigit and (b) ImageJ, using CMYK as the example. With Trigit, only one selection is required per ROI to obtain the color values, whereas in ImageJ each ROI requires four measurements, one for each channel (C, M, Y, and K). (For interpretation of the references to color in this figure legend, the reader is referred to the Web version of this article.)
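The contrast in Fig. 10 comes down to a conversion step: once a single RGB measurement is available for an ROI, all four CMYK channels follow from the commonly used naive formula in one call, so no per-channel re-measurement is needed. The sketch below is for illustration only and is not how Trigit or ImageJ computes CMYK internally.

```python
# Illustration of why one RGB measurement per ROI is enough for CMYK:
# the commonly used naive conversion yields all four channels at once.
def rgb_to_cmyk(r, g, b):
    """Convert 8-bit RGB to CMYK fractions in [0, 1]."""
    r, g, b = r / 255, g / 255, b / 255
    k = 1 - max(r, g, b)
    if k == 1:                              # pure black
        return 0.0, 0.0, 0.0, 1.0
    return ((1 - r - k) / (1 - k),          # C
            (1 - g - k) / (1 - k),          # M
            (1 - b - k) / (1 - k),          # Y
            k)                              # K

print(rgb_to_cmyk(170, 198, 229))           # one selection -> (C, M, Y, K)
```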

Data availability

Data will be made available on request.

Acknowledgment

ADT acknowledges the support from the University of New South Wales Scientia PhD Scholarship. RC acknowledges the support from the National Health and Medical Research Council Emerging Leadership Investigator Grant (NHMRC APP1173428) and the UNSW Scientia Fellowship.

Appendix A. Supplementary data

Supplementary data to this article can be found online at https://doi.org/10.1016/j.biosx.2023.100361.

References

Alizadeh, N., Salimi, A., Hallaj, R., 2019. Mimicking peroxidase-like activity of Co3O4-CeO2 nanosheets integrated paper-based analytical devices for detection of glucose with smartphone. Sensor. Actuator. B Chem. 288, 44–52.
Arena, E.T., Rueden, C.T., Hiner, M.C., Wang, S., Yuan, M., Eliceiri, K.W., 2017. Quantitating the cell: turning images into numbers with ImageJ. Wiley Interdisciplinary Reviews: Dev. Biol. 6 (2), e260.
Arnal Barbedo, J.G., 2013. Digital image processing techniques for detecting, quantifying and classifying plant diseases. SpringerPlus 2 (1), 660.
Cantrell, K., Erenas, M.M., de Orbe-Payá, I., Capitán-Vallvey, L.F., 2010. Use of the hue parameter of the hue, saturation, value color space as a quantitative analytical parameter for bitonal optical sensors. Anal. Chem. 82 (2), 531–542.
Chen, G., Fang, C., Chai, H.H., Zhou, Y., Yun Li, W., Yu, L., 2019. Improved analytical performance of smartphone-based colorimetric analysis by using a power-free imaging box. Sensor. Actuator. B Chem. 281, 253–261.
Cho, E., Jung, S., 2018. Biomolecule-functionalized smart polydiacetylene for biomedical and environmental sensing. Molecules 23 (1), 107.
Choi, S., Kim, S.-K., Lee, G.-J., Park, H.-K., 2015. Paper-based 3D microfluidic device for multiple bioassays. Sensor. Actuator. B Chem. 219, 245–250.


Chung, S., Park, T.S., Park, S.H., Kim, J.Y., Park, S., Son, D., Bae, Y.M., Cho, S.I., 2015. Colorimetric sensor array for white wine tasting. Sensors 15 (8).
de Oliveira, T.V., Soares, N.d.F.F., Coimbra, J.S.d.R., de Andrade, N.J., Moura, L.G., Medeiros, E.A.A., de Medeiros, H.S., 2015. Stability and sensitivity of polydiacetylene vesicles to detect Salmonella. Sensor. Actuator. B Chem. 221, 653–658.
Doerflinger, A., Quang, N.N., Gravel, E., Ducongé, F., Doris, E., 2019. Aptamer-decorated polydiacetylene micelles with improved targeting of cancer cells. Int. J. Pharm. 565, 59–63.
Dolai, S., Bhunia, S.K., Beglaryan, S.S., Kolusheva, S., Zeiri, L., Jelinek, R., 2017. Colorimetric polydiacetylene–aerogel detector for volatile organic compounds (VOCs). ACS Appl. Mater. Interfaces 9 (3), 2891–2898.
Eaidkong, T., Mungkarndee, R., Phollookin, C., Tumcharern, G., Sukwattanasinitt, M., Wacharasindhu, S., 2012. Polydiacetylene paper-based colorimetric sensor array for vapor phase detection and identification of volatile organic compounds. J. Mater. Chem. 22 (13), 5970–5977.
Fain, G., Sampath, A.P., 2018. Rod and cone interactions in the retina. F1000Res 7, F1000. Faculty Rev-657.
He, J., Xiao, G., Chen, X., Qiao, Y., Xu, D., Lu, Z., 2019. A thermoresponsive microfluidic system integrating a shape memory polymer-modified textile and a paper-based colorimetric sensor for the detection of glucose in human sweat. RSC Adv. 9 (41), 23957–23963.
Hosu, O., Lettieri, M., Papara, N., Ravalli, A., Sandulescu, R., Cristea, C., Marrazza, G., 2019. Colorimetric multienzymatic smart sensors for hydrogen peroxide, glucose and catechol screening analysis. Talanta 204, 525–532.
Ibraheem, N., Hasan, M., Khan, R.Z., Mishra, P., 2012. Understanding color models: a review. ARPN J. Sci. Technol. 2.
Jia, M.-Y., Wu, Q.-S., Li, H., Zhang, Y., Guan, Y.-F., Feng, L., 2015. The calibration of cellphone camera-based colorimetric sensor array and its application in the determination of glucose in urine. Biosens. Bioelectron. 74, 1029–1037.
Jiang, X.-d., Sheng, B., Lin, W.-y., Lu, W., Ma, L.-z., 2014. Image anti-aliasing techniques for Internet visual media processing: a review. J. Zhejiang Univ. - Sci. C 15 (9), 717–728.
Jordan, G., Deeb, S.S., Bosten, J.M., Mollon, J.D., 2010. The dimensionality of color vision in carriers of anomalous trichromacy. J. Vis. 10 (8), 12-12.
Kim, S.D., Koo, Y., Yun, Y., 2017. A smartphone-based automatic measurement method for colorimetric pH detection using a color adaptation algorithm. Sensors 17 (7), 1604.
Kurniastuti, I., Yuliati, E.N.I., Yudianto, F., Wulan, T.D., 2022. Determination of Hue Saturation Value (HSV) color feature in kidney histology image. J. Phys. Conf. 2157 (1), 012020.
Li, B., Li, X., Dong, Y., Wang, B., Li, D., Shi, Y., Wu, Y., 2017. Colorimetric sensor array based on gold nanoparticles with diverse surface charges for microorganisms identification. Anal. Chem. 89 (20), 10639–10643.
Li, Y., Sun, J., Mao, W., Tang, S., Liu, K., Qi, T., Deng, H., Shen, W., Chen, L., Peng, L., 2019. Antimony-doped tin oxide nanoparticles as peroxidase mimics for paper-based colorimetric detection of glucose using smartphone read-out. Microchim. Acta 186 (7), 403.
Logger, J.G.M., de Jong, E., Driessen, R.J.B., van Erp, P.E.J., 2020. Evaluation of a simple image-based tool to quantify facial erythema in rosacea during treatment. Skin Res. Technol. 26 (6), 804–812.
Ly, B.C.K., Dyer, E.B., Feig, J.L., Chien, A.L., Del Bino, S., 2020. Research techniques made simple: cutaneous colorimetry: a reliable technique for objective skin color measurement. J. Invest. Dermatol. 140 (1), 3–12.e1.
ImageJ. MacOS. https://imagej.net/platforms/macos. (Accessed 9 December 2022).
Mazur, F., Tran, H., Kuchel, R.P., Chandrawati, R., 2020. Rapid detection of listeriolysin O toxin based on a nanoscale liposome–gold nanoparticle platform. ACS Appl. Nano Mater. 3 (7), 7270–7280.
Mazzone, P.J., Wang, X.F., Xu, Y., Mekhail, T., Beukemann, M.C., Na, J., Kemling, J.W., Suslick, K.S., Sasidhar, M., 2012. Exhaled breath analysis with a colorimetric sensor array for the identification and characterization of lung cancer. J. Thorac. Oncol. 7 (1), 137–142.
Mazzone, P.J., Wang, X.-F., Lim, S., Choi, H., Jett, J., Vachani, A., Zhang, Q., Beukemann, M., Seeley, M., Martino, R., Rhodes, P., 2015. Accuracy of volatile urine biomarkers for the detection and characterization of lung cancer. BMC Cancer 15 (1), 1001.
Nature. Guide to Preparing Final Artwork. https://www.nature.com/documents/nature-final-artwork.pdf. (Accessed 9 December 2022).
Newton, I., Innys, W., Innys, J., 1721. Opticks: or, A Treatise of the Reflections, Refractions, Inflections and Colours of Light. William and John Innys at the West End of St. Paul's.
Nguy-Robertson, A., Peng, Y., Arkebauer, T., Scoby, D., Schepers, J., Gitelson, A., 2015. Using a simple leaf color chart to estimate leaf and canopy chlorophyll a content in maize (Zea mays). Commun. Soil Sci. Plant Anal. 46 (21), 2734–2745.
Nguyen, L.H., Naficy, S., McConchie, R., Dehghani, F., Chandrawati, R., 2019. Polydiacetylene-based sensors to detect food spoilage at low temperatures. J. Mater. Chem. C 7 (7), 1919–1926.
Nguyen, L.H., Oveissi, F., Chandrawati, R., Dehghani, F., Naficy, S., 2020. Naked-eye detection of ethylene using thiol-functionalized polydiacetylene-based flexible sensors. ACS Sens. 5 (7), 1921–1928.
Park, I.S., Park, H.J., Kim, J.-M., 2013. A soluble, low-temperature thermochromic and chemically reactive polydiacetylene. ACS Appl. Mater. Interfaces 5 (17), 8805–8812.
OpenCV. Miscellaneous Image Transformations. https://docs.opencv.org/3.4/d7/d1b/group__imgproc__misc.html#ga366aae45a6c1289b341d140839f18717.
Park, D.-H., Heo, J.-M., Jeong, W., Yoo, Y.H., Park, B.J., Kim, J.-M., 2018. Smartphone-based VOC sensor using colorimetric polydiacetylenes. ACS Appl. Mater. Interfaces 10 (5), 5014–5021.
Parker, R.W., Wilson, D.J., Mace, C.R., 2020. Open software platform for automated analysis of paper-based microfluidic devices. Sci. Rep. 10 (1), 11284.
Parolo, C., Sena-Torralba, A., Bergua, J.F., Calucho, E., Fuentes-Chust, C., Hu, L., Rivas, L., Álvarez-Diduk, R., Nguyen, E.P., Cinti, S., Quesada-González, D., Merkoçi, A., 2020. Tutorial: design and fabrication of nanoparticle-based lateral-flow immunoassays. Nat. Protoc. 15 (12), 3788–3816.
Phonchai, N., Khanantong, C., Kielar, F., Traiphol, R., Traiphol, N., 2019. Low-temperature reversible thermochromic polydiacetylene/zinc(II)/Zinc oxide nanocomposites for colorimetric sensing. ACS Appl. Nano Mater. 2 (7), 4489–4498.
American Chemical Society. Author Guidelines. https://publish.acs.org/publish/author_guidelines?coden=accacs#figure_illustration_services. (Accessed 9 December 2022).
Pumtang, S., Siripornnoppakhun, W., Sukwattanasinitt, M., Ajavakom, A., 2011. Solvent colorimetric paper-based polydiacetylene sensors from diacetylene lipids. J. Colloid Interface Sci. 364 (2), 366–372.
Rueden, C.T., Schindelin, J., Hiner, M.C., DeZonia, B.E., Walter, A.E., Arena, E.T., Eliceiri, K.W., 2017. ImageJ2: ImageJ for the next generation of scientific image data. BMC Bioinf. 18 (1), 529.
Schneider, C.A., Rasband, W.S., Eliceiri, K.W., 2012. NIH Image to ImageJ: 25 years of image analysis. Nat. Methods 9 (7), 671–675.
Seymour, J., 2022. Color inconstancy in CIELAB: a red herring? Color Res. Appl. 47 (4), 900–919.
Sharifzadeh, S., Clemmensen, L.H., Borggaard, C., Støier, S., Ersbøll, B.K., 2014. Supervised feature selection for linear and non-linear regression of L*a*b* color from multispectral images of meat. Eng. Appl. Artif. Intell. 27, 211–227.
Shim, J., Kim, B., Kim, J.-M., 2017. Aminopyridine-containing supramolecular polydiacetylene: film formation, thermochromism and micropatterning. Supramol. Chem. 29 (5), 395–400.
Swedlow, J.R., Eliceiri, K.W., 2009. Open source bioimage informatics for cell biology. Trends Cell Biol. 19 (11), 656–660.
Tjandra, A.D., Weston, M., Tang, J., Kuchel, R.P., Chandrawati, R., 2021. Solvent injection for polydiacetylene particle synthesis – effects of varying solvent, injection rate, monomers and needle size on polydiacetylene properties. Colloids Surf. A Physicochem. Eng. Asp. 619, 126497.
Tjandra, A.D., Pham, A.-H., Chandrawati, R., 2022. Polydiacetylene-based sensors to detect volatile organic compounds. Chem. Mater. 34 (7), 2853–2876.
Troscianko, J., Stevens, M., 2015. Image calibration and analysis toolbox – a free software suite for objectively measuring reflectance, colour and pattern. Methods Ecol. Evol. 6 (11), 1320–1331.
Wang, T.-T., Lio, C.k., Huang, H., Wang, R.-Y., Zhou, H., Luo, P., Qing, L.-S., 2020. A feasible image-based colorimetric assay using a smartphone RGB camera for point-of-care monitoring of diabetes. Talanta 206, 120211.
Weston, M., Kuchel, R.P., Ciftci, M., Boyer, C., Chandrawati, R., 2020a. A polydiacetylene-based colorimetric sensor as an active use-by date indicator for milk. J. Colloid Interface Sci. 572, 31–38.
Weston, M., Phan, M.A.T., Arcot, J., Chandrawati, R., 2020b. Anthocyanin-based sensors derived from food waste as an active use-by date indicator for milk. Food Chem. 326, 127017.
Weston, M., Kuchel, R.P., Chandrawati, R., 2020c. A polydiacetylene-based colorimetric sensor as an active use-by date for plant-based milk alternatives. Macromol. Rapid Commun. 41 (18), 2000172.
Weston, M., Ciftci, M., Kuchel, R.P., Boyer, C., Chandrawati, R., 2020d. Polydiacetylene for the detection of α-hemolysin in milk toward the diagnosis of bovine mastitis. ACS Appl. Poly. Mater. 2 (11), 5238–5248.
Weston, M., Kuchel, R.P., Chandrawati, R., 2021. Digital analysis of polydiacetylene quality tags for contactless monitoring of milk. Anal. Chim. Acta 1148, 238190.
Weston, M., Pham, A.-H., Tubman, J., Gao, Y., Tjandra, A.D., Chandrawati, R., 2022. Polydiacetylene-based sensors for food applications. Mater. Adv. 3 (10), 4088–4102.
Wiley. Guidelines for Preparing Figures. https://authorservices.wiley.com/author-resources/Journal-Authors/Prepare/manuscript-preparation-guidelines.html/figure-preparation.html. (Accessed 9 December 2022).
Woolf, M.S., Dignan, L.M., Scott, A.T., Landers, J.P., 2021. Digital postprocessing and image segmentation for objective analysis of colorimetric reactions. Nat. Protoc. 16 (1), 218–238.
Xiao, G., He, J., Chen, X., Qiao, Y., Wang, F., Xia, Q., Yu, L., Lu, Z., 2019. A wearable, cotton thread/paper-based microfluidic device coupled with smartphone for sweat glucose sensing. Cellulose 26 (7), 4553–4562.
Yoo, K., Kim, S., Han, N., Kim, G.E., Shin, M.J., Shin, J.S., Kim, M., 2018. Stepwise blue-red-yellow color change of a polydiacetylene sensor through internal and external transitions. Dyes Pigments 149, 242–245.
Yoon, B., Park, I.S., Shin, H., Park, H.J., Lee, C.W., Kim, J.-M., 2013. A litmus-type colorimetric and fluorometric volatile organic compound sensor based on inkjet-printed polydiacetylenes on paper substrates. Macromol. Rapid Commun. 34 (9), 731–735.
Apple Developer. ImageJ on macOS Big Sur 11.0.1 Keep Crashing and I Cannot Work with it. https://developer.apple.com/forums/thread/668712. (Accessed 9 December 2022).

