IBK OWOADE Graduate Seminar
Product development and innovation are critical components of organizational success, enabling
companies to create value, remain competitive, and achieve long-term sustainability. According
to a study by Cooper (2001), effective product development processes can significantly impact
an organization's performance by driving revenue growth, enhancing market share, and fostering
customer satisfaction. Innovation, on the other hand, involves the introduction of novel ideas,
technologies, or processes that lead to the creation of new products or improvements in existing
ones. Research by Davila et al. (2006) emphasizes the role of innovation in driving organizational growth and long-term value creation.
One of the key ways in which product development and innovation contribute to organizational
performance is through the acquisition of market share and the attainment of competitive
advantage. Studies by Tellis et al. (2009) have shown that companies that invest in innovation
and product development tend to outperform their competitors in terms of market share and
profitability. By introducing innovative products that offer unique features or address unmet
customer needs, organizations can differentiate themselves in the marketplace and attract a larger
customer base. Furthermore, innovative products often command premium prices, allowing
companies to achieve higher profit margins and gain a competitive edge (Chen & Hambrick,
1995).
Product development and innovation also have a significant impact on customer satisfaction and
loyalty, which are critical drivers of organizational success. Research by Gunday et al. (2011)
suggests that organizations that consistently deliver innovative products are more likely to satisfy
customer needs and preferences, leading to higher levels of customer satisfaction and loyalty.
Innovative products that offer superior quality, enhanced functionality, or improved performance
are perceived positively by customers, resulting in stronger brand relationships and increased
repeat purchases (Jaworski & Kohli, 1993). Moreover, organizations that engage customers in
the product development process through co-creation or feedback mechanisms can better
understand their needs and preferences, leading to the development of more customer-centric products.
In today's rapidly evolving business landscape, organizational resilience and adaptability are
crucial for survival and growth. Product development and innovation play a vital role in enabling organizations to adapt to changing market conditions and withstand competitive pressures. According to research by Teece (2018), companies that foster a culture of innovation are better equipped to respond to disruptions, overcome challenges, and seize opportunities. By continuously exploring new ideas, technologies, and market opportunities, organizations can enhance their agility, flexibility, and ability to innovate, positioning themselves for sustained success.
Innovation
Innovation refers to the process of introducing novel ideas, products, services, or processes that bring about positive change and
value creation (Tidd & Bessant, 2018). It involves the generation, adoption, and implementation
of new concepts or practices to address market needs, solve problems, and enhance
competitiveness (Rogers, 2003). Innovation can manifest in various forms, including product, process, and organizational innovation (Damanpour, 1991). Scholars like Chesbrough (2003) emphasize the importance of open innovation, which involves collaborating with external partners and stakeholders to leverage external knowledge and capabilities in the innovation process.
Innovation plays a pivotal role in driving organizational performance and long-term success.
Research indicates a positive correlation between innovation and various performance indicators,
including financial performance, market share, and competitive advantage (Damanpour &
Aravind, 2012). Companies that invest in innovation initiatives tend to achieve higher
profitability, revenue growth, and return on investment compared to their less innovative peers. Innovation also enables organizations to respond to changing market dynamics, anticipate customer needs, and differentiate themselves from competitors. By embracing innovation and continuous improvement, organizations can enhance their resilience and agility in the face of uncertainty and disruption.
Several factors influence an organization's ability to innovate effectively. Internal factors such as
leadership support, organizational culture, and resource allocation play a critical role in fostering innovation. Leadership characterized by vision, risk-taking, and support for experimentation sets the tone for innovation
within the organization (Eisenbeiss et al., 2015). Moreover, organizations with a supportive
culture that encourages creativity, collaboration, and knowledge sharing are more likely to
succeed in innovation endeavors (Anderson et al., 2014). External factors such as market competition, technological change, and regulatory conditions also shape innovation, creating opportunities and challenges that necessitate organizational adaptation (Tushman &
O'Reilly, 1996).
To leverage the benefits of innovation and enhance organizational performance, companies
employ various innovation strategies tailored to their objectives and contexts. These strategies
may include investing in research and development (R&D) to create new products or
technologies, collaborating with external partners to access new markets or expertise, and building internal capabilities that support experimentation and learning. Additionally, organizations may adopt open innovation practices to tap into external knowledge sources and collaborative networks (Chesbrough, 2003). By aligning innovation efforts with business goals, organizations can optimize their innovation performance and achieve sustainable competitive advantage.
Organizational performance
Organizational performance refers to the overall effectiveness and efficiency with which an
organization achieves its objectives and goals. It encompasses various dimensions, including
financial performance, operational efficiency, market share, customer satisfaction, and employee
productivity (Richard et al., 2009). High organizational performance is essential for long-term success, enabling organizations to sustain competitive advantage, adapt to changing market conditions, and meet stakeholder expectations (Hitt et al., 2007). Research by Prajogo and Sohal (2006) highlights the significance of organizational performance as a key determinant of competitiveness and long-term viability. By striving to improve performance across different areas, organizations can enhance their overall effectiveness and achieve their strategic objectives.
Organizational performance is typically measured using a range of indicators that reflect the organization's achievements and effectiveness in different areas.
Financial indicators such as revenue growth, profitability, return on investment (ROI), and cash
flow are commonly used to evaluate financial performance (Kaplan & Norton, 1996).
Operational indicators, including productivity, efficiency, quality, and lead times, measure the effectiveness of internal processes and operations. Customer-related metrics such as customer satisfaction, retention rates, and market share reflect the organization's ability to meet customer needs and preferences (Anderson et al., 1994). Employee-related indicators such as engagement, retention, and skill development assess the organization's effectiveness in managing its human resources and fostering a conducive work environment (Pfeffer, 1994). By tracking these performance metrics, organizations can identify areas of strength and improvement, set performance targets, and drive continuous improvement.
Organizational performance is influenced by various internal and external factors that impact the
organization's ability to achieve its goals and objectives. Internal factors such as leadership
quality, organizational culture, strategic planning, and resource allocation play a critical role in driving performance. A strong organizational culture fosters employee engagement, alignment, and competitiveness (Schein, 2010). Strategic planning helps align organizational goals with market opportunities and guides resource allocation processes (Thompson & Strickland, 1995). External factors such as market conditions, competition, technological change, and regulatory environments impact organizational performance by influencing market demand, industry trends, and business operating conditions (Porter, 1980). By understanding these factors and their implications,
organizations can proactively manage risks, capitalize on opportunities, and enhance their
performance outcomes.
Organizations employ various strategies to enhance their performance. Continuous improvement methodologies such as Total Quality Management (TQM), Lean Six Sigma, and
Kaizen focus on eliminating waste, optimizing processes, and enhancing product or service
quality (Oakland, 2003). Innovation and technology adoption initiatives enable organizations to
develop new products, services, and business models that differentiate them from competitors
and create value for customers (Tidd & Bessant, 2009). Talent management practices such as
recruitment, training, performance management, and succession planning are essential for
attracting, developing, and retaining skilled employees who contribute to organizational success
(Boudreau & Ramstad, 2005). Strategic alliances, partnerships, and collaborations with external
stakeholders such as suppliers, customers, and industry peers can also enhance organizational
performance by leveraging complementary resources, capabilities, and expertise (Dyer & Singh,
1998). By implementing these strategies and fostering a culture of continuous improvement and
innovation, organizations can enhance their performance and achieve sustainable growth and
success.
Theoretical Review
Resource-based theory
Resource-based theory (RBT) explains how organizations achieve and sustain competitive advantage through the strategic management
of their resources and capabilities. According to this theory, firms gain competitive advantage by
possessing and leveraging unique, valuable, and difficult-to-imitate resources and capabilities.
Recent research in the field has expanded the scope of RBT to include dynamic capabilities,
which emphasize an organization's ability to adapt and reconfigure its resource base in response
to changing environmental conditions and competitive pressures. For instance, studies have
explored how firms develop dynamic capabilities through processes such as organizational learning, knowledge integration, and strategic renewal.
One recent study by Peteraf et al. (2020) examines the role of dynamic capabilities in shaping the
competitive advantage of firms in the technology industry. The research finds that firms with
superior dynamic capabilities are better able to identify and exploit new opportunities, navigate
industry disruptions, and sustain competitive advantage over time. Another study by Helfat and
Peteraf (2015) explores the mechanisms through which firms build and deploy dynamic capabilities over time. Makadok (2019) offers a nuanced understanding of resources by categorizing them into four
types: scarce, imperfectly mobile, specialized, and durable. This categorization aids in better
comprehending how different types of resources contribute to competitive advantage and how
they may interact within firms. Additionally, focusing on the dynamics of resource deployment,
a study by Eisenhardt and Martin (2020) highlights the significance of dynamic capabilities in
leveraging resources for competitive advantage. They emphasize that firms with superior capability-deployment processes are better positioned to sustain advantage in volatile markets.
Barney and Zhang (2021) explore how institutional voids in emerging markets influence
resource-based advantages, suggesting that firms need to adapt their resource strategies to
navigate these unique environments successfully. Similarly, Hitt et al. (2020) investigate the role of interfirm relationships, showing how collaborations and alliances can enhance firms' resource bases and competitive positions. These
studies underscore the continued relevance and evolution of resource-based theory in explaining firm heterogeneity and competitive outcomes. By integrating insights from diverse disciplines and considering the interplay between internal
resources, capabilities, and external contexts, researchers advance our understanding of how
firms create and sustain competitive advantage in an increasingly complex and dynamic
landscape.
Dynamic capability theory focuses on an organization's ability to adapt, innovate, and reconfigure its resource base in response to environmental change. Recent research has advanced understanding in several key areas, including the microfoundations of dynamic capabilities, the role of ambidexterity, and the development of digital capabilities. Studies in this area have examined how firms develop and deploy dynamic
capabilities across various contexts and industries. For example, Teece (2018) discusses the application of dynamic capabilities in rapidly changing environments, highlighting the role of innovation, experimentation, and strategic agility in driving firm performance. Another study by Zollo and Winter (2021) explores the microfoundations of dynamic capabilities, emphasizing the learning processes that underpin organizational adaptability and resilience. These studies shed light on the mechanisms through
which firms build and leverage dynamic capabilities to maintain competitiveness in turbulent
environments.
A growing stream of research focuses on microfoundations, exploring the individual-level behaviors, routines, and cognitive processes that underpin
organizational adaptability. For instance, studies by Wilden et al. (2020) and Felin et al. (2019)
delve into the role of leadership, organizational culture, and employee mindset in fostering a
dynamic capability mindset within organizations. These studies highlight the importance of
aligning individual behaviors and organizational processes with the goals of adaptability and renewal. Another prominent theme is organizational ambidexterity, which involves the simultaneous pursuit of exploration and exploitation activities. Teece (2016) argues that dynamic capabilities enable firms to balance exploration (i.e., innovation and the search for new opportunities) with exploitation (i.e., the refinement of existing competencies), thereby achieving sustained competitive advantage. Recent studies by Gupta et al. (2021) and He and Wong (2020)
explore how firms develop ambidextrous capabilities and the organizational mechanisms that
facilitate effective ambidexterity. In addition, with the rapid advancement of digital technologies, scholars have increasingly examined digitally enabled dynamic capabilities. Teece (2018) discusses the concept of "digital dynamic capabilities," which refers to
an organization's ability to leverage digital technologies for innovation, agility, and competitive
advantage. Research by Zhu et al. (2021) and Srivastava et al. (2020) examines how firms build
digital capabilities and transform their business models to thrive in the digital age. Overall,
recent research in dynamic capability theory contributes to a deeper understanding of how
organizations adapt and thrive in turbulent environments. By investigating the microfoundations of dynamic capabilities, ambidexterity, and digital transformation, scholars advance knowledge on how firms build and deploy capabilities to sustain competitive advantage.
Absorptive capacity theory, initially proposed by Cohen and Levinthal (1990), focuses on how
organizations acquire, assimilate, and apply external knowledge to enhance their innovation
capabilities and performance. Recent research in absorptive capacity theory has explored various
dimensions, including the microfoundations of absorptive capacity, the role of collaboration and
knowledge networks, and the influence of contextual factors on absorptive capacity. One significant stream of research focuses on the microfoundations of absorptive capacity, examining the individual-level processes and organizational routines that facilitate knowledge
absorption and utilization. For instance, studies by Zahra and George (2002) and Lane et al.
(2016) investigate how factors such as prior knowledge, cognitive diversity, and learning
mechanisms influence an organization's absorptive capacity. These studies shed light on the
cognitive processes and behaviors that underpin the effective absorption and integration of new
knowledge.
Furthermore, research has explored the role of collaboration and knowledge networks in
enhancing absorptive capacity. Lane et al. (2020) highlight the importance of interorganizational
collaboration and knowledge exchange in facilitating knowledge flows and expanding absorptive
capacity across organizational boundaries. Similarly, studies by Ahuja and Katila (2004) and
Lavie and Rosenkopf (2006) examine the role of strategic alliances and network ties in
augmenting absorptive capacity and innovation performance. Moreover, recent research has
examined the influence of contextual factors, such as industry dynamics, organizational culture,
and technological complexity, on absorptive capacity. For example, Laursen and Salter (2014)
explore how industry characteristics, such as technological turbulence and market competition,
affect firms' absorptive capacity-building strategies. Additionally, studies by Van Wijk et al.
(2019) and Zahra et al. (2014) investigate the role of organizational culture and leadership in
fostering a conducive environment for absorptive capacity development. Overall, recent research has deepened our understanding of absorptive capacity. By examining the microfoundations of absorptive capacity, the role of collaboration and knowledge networks, and the influence of contextual factors, scholars advance knowledge on how firms can enhance their learning and innovation capabilities.
The theory of production and cost is a fundamental concept in economics that explores how
firms produce goods and services efficiently and the associated costs involved in the production
process. At its core, this theory examines the relationship between inputs (such as labor and
capital) and outputs (goods and services) to determine the most cost-effective methods of
production. Recent research in this area has contributed to a deeper understanding of production
functions, cost structures, and the factors influencing firms' production decisions. One key aspect
of the theory of production is the concept of production functions, which describe the
relationship between inputs and outputs. Recent studies have advanced our understanding of production functions by incorporating factors such as technological change, returns to scale, and resource constraints. For example, research by Acemoglu and Autor (2020) explores
how advancements in technology affect the shape of production functions and influence firms'
productivity levels. By incorporating insights from fields such as machine learning and artificial
intelligence, this research sheds light on the evolving nature of production processes in the
digital age.
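To make the notion of a production function concrete, a standard textbook form (illustrative only, not tied to any study cited above) is the Cobb-Douglas specification:

    Q = A L^{\alpha} K^{\beta}

where Q is output, A is total factor productivity, L and K are labor and capital inputs, and \alpha and \beta are the output elasticities of labor and capital; \alpha + \beta = 1 corresponds to constant returns to scale, while \alpha + \beta > 1 implies increasing returns.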
Furthermore, the theory of production and cost examines various cost structures that firms face
in the production process, including fixed costs, variable costs, and total costs. Recent research
has investigated how changes in input prices, technological innovation, and market conditions
impact firms' cost structures and profitability. For instance, studies by Hall et al. (2019) and
Brynjolfsson and McAfee (2021) examine the role of digital technologies in reducing production
costs and increasing efficiency in various industries. These studies highlight the transformative
impact of technological innovation on firms' cost structures and competitive dynamics. Another
area of research within the theory of production and cost focuses on factors influencing firms'
production decisions, such as economies of scale, factor substitution, and input prices. Recent
studies have explored how firms optimize production processes to minimize costs while
maximizing output levels. For example, research by Aghion et al. (2017) investigates the
relationship between competition, innovation, and productivity growth, highlighting how firms'
production decisions are shaped by market dynamics and competitive pressures. Similarly,
studies by Roberts and Tybout (2019) and Melitz and Redding (2020) examine how firms adjust
their production strategies in response to changes in trade policies and globalization trends.
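To fix notation for the cost concepts discussed above, the standard textbook identities (illustrative, not drawn from any particular study cited here) are:

    TC(q) = FC + VC(q), \qquad AC(q) = \frac{TC(q)}{q}, \qquad MC(q) = \frac{dTC(q)}{dq}

where q denotes output, FC fixed cost, VC(q) variable cost, TC total cost, AC average cost, and MC marginal cost; a cost-minimizing firm chooses the input mix at which the marginal product per unit of input price is equalized across inputs.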
Ambidexterity theory
Ambidexterity theory examines how organizations balance exploration and exploitation activities. Exploration involves seeking out new opportunities, experimenting with novel ideas, and developing new capabilities, while exploitation focuses on refining existing competencies and leveraging current resources. Recent research in ambidexterity theory has expanded our understanding of how organizations achieve this balance, the
organizational mechanisms facilitating ambidexterity, and the implications for firm performance
and competitiveness. One key area of research within ambidexterity theory examines the
organizational mechanisms and practices that enable firms to simultaneously pursue exploration
and exploitation. Studies have identified various structural, cultural, and strategic approaches that
facilitate ambidexterity. For example, research by O'Reilly and Tushman (2019) highlights the
role of organizational design, such as the use of separate units or teams dedicated to exploration
and exploitation, in fostering ambidexterity. Similarly, studies by Gibson and Birkinshaw (2020)
and Gupta and Wang (2021) explore how leadership styles, culture, and incentives can promote ambidextrous behavior throughout the organization.
Furthermore, recent research has investigated the antecedents and consequences of ambidexterity
at the individual, team, and organizational levels. For instance, studies by Raisch and Birkinshaw
(2018) and Lubatkin et al. (2020) examine how individual and team-level factors, such as
cognitive diversity, task allocation, and learning orientation, influence ambidextrous behaviors.
These studies contribute to a deeper understanding of the personal and interpersonal dynamics
that underpin ambidexterity within organizations. Another area of research within ambidexterity
theory explores the impact of ambidexterity on firm performance and competitiveness. While
early studies suggested a positive relationship between ambidexterity and performance, recent
research has provided more nuanced insights into the conditions under which ambidexterity leads
to superior outcomes. For example, research by Jansen et al. (2019) and He and Wong (2020)
examines the moderating effects of environmental dynamism, industry turbulence, and strategic orientation on the ambidexterity-performance relationship. Moreover, recent research has explored the role of ambidexterity in facilitating organizational adaptation and
resilience in turbulent environments. Studies by Teece (2018) and Doz and Kosonen (2021) highlight how ambidextrous firms are better able to respond to disruptions, market shifts, and competitive threats. By balancing exploration and exploitation,
ambidextrous organizations can navigate uncertainty more effectively and sustain long-term
success. Overall, recent research in ambidexterity theory has deepened our understanding of how
organizations balance exploration and exploitation to achieve competitive advantage and adapt to changing environments.
The diffusion of innovation theory, proposed by Everett Rogers in 1962, explores how new
ideas, products, and technologies spread within a social system over time. This theory identifies
different categories of adopters, including innovators, early adopters, early majority, late
majority, and laggards, each characterized by their readiness to embrace new innovations. Recent
research in the field of diffusion of innovation theory has expanded our understanding of the
factors influencing the adoption and diffusion process, the role of communication channels and
networks, and the implications for organizational change and societal impact. One significant
area of research within diffusion of innovation theory focuses on the factors influencing the
adoption and diffusion of innovations. Recent studies have identified various individual,
organizational, and contextual factors that shape the adoption decision. For example, research by
Venkatesh et al. (2019) and Rogers and Singhal (2021) explores the role of perceived usefulness,
ease of use, compatibility, and complexity in determining individuals' willingness to adopt new
technologies. Additionally, studies by Damanpour (2014) and Agarwal et al. (2020) investigate
how organizational characteristics, such as size, structure, and culture, influence the adoption and implementation of innovations.
Furthermore, recent research has examined the role of communication channels and networks in
facilitating the diffusion of innovations. Studies have highlighted the importance of interpersonal communication, social networks, and media channels in disseminating information, shaping perceptions, and influencing adoption decisions. For instance, research by Valente (2020) and Granovetter (2018) examines the role of opinion leaders, social contagion, and word-of-mouth communication in accelerating diffusion. These studies contribute to a deeper understanding of the mechanisms through which information flows within social systems and
drives the diffusion of innovations. Another area of research within diffusion of innovation
theory explores the implications for organizational change and innovation management. Recent
studies have examined how firms can leverage knowledge management, organizational learning,
and change management practices to facilitate the adoption and implementation of innovations.
For example, research by Tushman and O'Reilly (2019) and Crossan et al. (2021) explores the organizational learning processes that support innovation adoption and adaptation within organizations. These studies provide insights into the managerial practices that facilitate successful innovation implementation.
In addition, recent research has investigated the broader societal impact of innovation diffusion,
including its implications for economic development, public policy, and social change. Studies
by Mizruchi and Fein (2020) and Muro et al. (2018) examine how innovation diffusion contributes to economic development, highlighting the role of public policy, regulatory frameworks, and social norms in shaping the diffusion process and ensuring equitable
access to innovations. Overall, recent research in diffusion of innovation theory has deepened
our understanding of how innovations spread within social systems and the factors influencing
the adoption process. By investigating the individual, organizational, and societal factors driving
diffusion, scholars continue to advance knowledge in this critical area of innovation studies.
Method of Estimation
Maximum Likelihood Estimation (MLE) is a statistical method used to estimate the parameters of a probability distribution that best explain the observed data. It is
widely employed in various fields, including econometrics, biostatistics, and machine learning,
due to its robustness and efficiency in parameter estimation (Greene, 2012). The MLE method
seeks to find the values of the model parameters that maximize the likelihood function, which
measures the probability of observing the data given the parameter values.
To understand the MLE method, consider a simple example of fitting a normal distribution to a
set of observed data points. The likelihood function in this case would represent the probability
of observing the data points under the assumption that they are sampled from a normal
distribution with unknown mean and variance. The MLE procedure involves finding the values
of the mean and variance that maximize this likelihood function, typically by taking the
derivative of the likelihood function with respect to the parameters and setting it to zero.
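For the normal example just described, with independent observations x_1, ..., x_n, the log-likelihood (maximized equivalently to the likelihood) and the resulting closed-form estimators are:

    \ell(\mu, \sigma^2) = -\frac{n}{2}\ln(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i - \mu)^2

    \hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i - \hat{\mu})^2

Setting the partial derivatives of \ell with respect to \mu and \sigma^2 to zero yields the sample mean and the (uncorrected) sample variance as the maximum likelihood estimates.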
One of the key advantages of the MLE method is its consistency, meaning that as the sample size
increases, the estimated parameters converge to the true population parameters (Casella &
Berger, 2002). This property makes MLE particularly useful for large datasets where other
estimation methods may be computationally intensive or less reliable. Additionally, MLE
provides asymptotically efficient estimates, meaning that they achieve the smallest possible
variance among all consistent estimators as the sample size approaches infinity.
In practice, the MLE method is implemented using optimization algorithms such as gradient
descent or the Newton-Raphson method to numerically maximize the likelihood function. These
algorithms iteratively update the parameter estimates until convergence is achieved. While MLE
is generally robust and efficient, it may suffer from bias or inefficiency in small sample sizes or when the model is misspecified.
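A minimal numerical sketch of this procedure in Python, assuming the NumPy and SciPy packages and using simulated data (all variable names here are illustrative):

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(42)
    data = rng.normal(loc=5.0, scale=2.0, size=500)   # simulated observations

    def neg_log_likelihood(params, x):
        # Negative log-likelihood of a normal model; minimizing it maximizes the likelihood.
        mu, log_sigma = params               # optimize log(sigma) so sigma stays positive
        sigma = np.exp(log_sigma)
        n = x.size
        return 0.5 * n * np.log(2 * np.pi * sigma**2) + np.sum((x - mu)**2) / (2 * sigma**2)

    result = minimize(neg_log_likelihood, x0=[0.0, 0.0], args=(data,))
    mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
    print(mu_hat, sigma_hat)                 # estimates should be close to 5.0 and 2.0

Here a quasi-Newton routine (the SciPy default) plays the role of the iterative optimizers mentioned above.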
Overall, the Maximum Likelihood Estimation method is a powerful tool for estimating model
parameters from observed data, offering consistency, efficiency, and asymptotic properties. Its
versatility and applicability across different domains make it a valuable tool for researchers and practitioners alike.
Structural Equation Modeling (SEM) is a multivariate statistical technique used to analyze the complex relationships between observed and latent variables in a structural
framework. It allows researchers to test and validate theoretical models by assessing the direct
and indirect effects of variables on one another. SEM integrates factor analysis and multiple regression analysis to examine relationships among variables simultaneously (Kline, 2015).
One of the key advantages of SEM is its ability to accommodate measurement error by
distinguishing between observed and latent variables. Latent variables, also known as constructs
or factors, are not directly observed but inferred from observed indicators. SEM enables
researchers to estimate the relationships between latent variables and assess their impact on the
observed variables, thereby reducing measurement error and enhancing the accuracy of the estimated relationships.
SEM allows for the estimation of both confirmatory and exploratory models. Confirmatory
factor analysis (CFA) is commonly used in SEM to test pre-specified hypotheses and assess the
fit of the proposed model to the data. Researchers specify a theoretical model based on existing
literature or theoretical frameworks and evaluate whether the observed data fit the hypothesized
model. In contrast, exploratory factor analysis (EFA) is used to identify underlying patterns or
dimensions in the data when there is limited prior knowledge about the relationships among variables.
SEM provides various fit indices to evaluate the goodness of fit between the proposed model and
the observed data. These fit indices include the Comparative Fit Index (CFI), Tucker-Lewis
Index (TLI), Root Mean Square Error of Approximation (RMSEA), and Standardized Root
Mean Square Residual (SRMR). A good-fitting model typically exhibits CFI and TLI values
above 0.95, RMSEA values below 0.06, and SRMR values below 0.08 (Hu & Bentler, 1999).
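For reference, one of these indices, the RMSEA, is typically computed from the model chi-square statistic as:

    RMSEA = \sqrt{\frac{\max(\chi^2 - df,\ 0)}{df\,(N - 1)}}

where \chi^2 is the model chi-square, df the model degrees of freedom, and N the sample size, so that values near zero indicate close fit between the model-implied and observed covariance structures.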
Researchers can use SEM to test complex theoretical models with multiple pathways and
mediating effects. By examining the direct and indirect relationships between variables, SEM
enables researchers to uncover underlying mechanisms and pathways through which variables
influence one another. This allows for a more nuanced understanding of the phenomena under
investigation and provides valuable insights for theory development and practical applications
(Bentler, 2005).
Ordinary Least Squares (OLS) is a widely used method in econometrics and statistical analysis
for estimating the parameters of a linear regression model. It aims to minimize the sum of
squared differences between the observed values of the dependent variable and the values
predicted by the linear regression equation. OLS assumes that the errors or residuals of the model
are normally distributed with a mean of zero and constant variance, making it an efficient and unbiased estimator under these classical assumptions.
One of the key advantages of OLS is its simplicity and ease of interpretation. The OLS estimator provides coefficient estimates that allow researchers to assess the strength and direction of the relationship between the independent and
dependent variables (Kennedy, 2008). Moreover, OLS regression results can be easily
interpreted in terms of marginal effects, which represent the change in the dependent variable for
a one-unit change in the independent variable, holding other variables constant (Wooldridge,
2015).
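In matrix notation, the OLS estimator that minimizes the sum of squared residuals has the closed-form solution:

    \hat{\beta} = (X^{\top} X)^{-1} X^{\top} y

where y is the n-by-1 vector of observations on the dependent variable and X is the n-by-k matrix of regressors (including a column of ones for the intercept); each element of \hat{\beta} is then interpreted as the marginal effect described above.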
OLS regression allows researchers to test hypotheses about the relationships between variables
and make predictions about the values of the dependent variable based on the values of the independent variables. Measures such as the coefficient of determination (R-squared) and the F-statistic are used to assess the overall fit of the model and the significance of the independent variables (Gujarati & Porter, 2009). These measures help researchers evaluate the explanatory power of the regression model and determine whether the model adequately captures the underlying relationships in the data.
Despite its widespread use, OLS regression has certain limitations and assumptions that
researchers should be aware of. OLS assumes that the independent variables are linearly related
to the dependent variable and that the errors are homoscedastic and normally distributed.
Violations of these assumptions can lead to biased and inefficient parameter estimates (Stock &
Watson, 2015). Additionally, OLS may not be appropriate for data with outliers or influential observations, as these can disproportionately affect the estimated coefficients and the overall fit of
the model.
In summary, OLS regression is a valuable tool for estimating the parameters of linear regression
models and analyzing the relationships between variables. It provides simple and interpretable estimates that offer insight into the factors influencing the dependent variable. However, researchers should exercise caution and
assess the assumptions of the OLS model to ensure the validity and reliability of their results.
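As an illustrative sketch, an OLS regression of this kind can be estimated in Python with the statsmodels package (the data here are simulated, not the study's dataset):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    x = rng.normal(size=200)                    # simulated independent variable
    y = 1.5 + 0.8 * x + rng.normal(size=200)    # simulated dependent variable

    X = sm.add_constant(x)                      # add an intercept column
    model = sm.OLS(y, X).fit()                  # minimize the sum of squared residuals
    print(model.summary())                      # coefficients, R-squared, F-statistic

The summary output reports the coefficient estimates alongside the R-squared and F-statistic discussed above, allowing the fit diagnostics to be read directly.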
Partial Least Squares (PLS) is a statistical technique used for modeling relationships between observed and latent variables. It is often used as an alternative to covariance-based structural equation modeling (SEM) when dealing with complex models involving multiple dependent and independent variables (Hair et al., 2019). PLS regression estimates the latent variables' scores and loadings simultaneously, making it suitable for both exploratory and confirmatory analysis.
PLS is characterized by its ability to handle small sample sizes, non-normal data distributions,
and complex models with collinear or high-dimensional data (Chin, 1998). Unlike traditional
regression methods, PLS does not require strict assumptions about the data distribution, making
it more robust and flexible in various research contexts (Hair et al., 2019). As a result, PLS is
widely used in fields such as marketing, management, and social sciences to analyze structural models. It is particularly advantageous in the presence of multicollinearity or when dealing with formative constructs (Chin, 2010). PLS prioritizes
prediction accuracy and model parsimony, making it suitable for situations where the focus is on
explaining variance and making accurate predictions rather than testing specific hypotheses
(Henseler et al., 2016). Additionally, PLS allows for the incorporation of both reflective and
formative measurement models, providing greater flexibility in model specification (Hair et al.,
2019).
Despite its strengths, PLS has certain limitations that researchers should consider. It may
produce less efficient parameter estimates compared to other estimation methods such as
maximum likelihood estimation (MLE) in SEM (Hair et al., 2019). Additionally, PLS may not
perform well in situations where the data do not meet the assumptions of the method, such as
cases of severe non-normality or small sample sizes (Chin, 1998). Researchers should carefully
evaluate the appropriateness of PLS for their specific research context and consider alternative
methods if necessary.
In summary, Partial Least Squares (PLS) is a versatile and powerful statistical technique for
analyzing complex structural models with latent constructs. Its flexibility, robustness, and
predictive capabilities make it well-suited for research in various fields, particularly when
dealing with small sample sizes, non-normal data, or high levels of multicollinearity. By
understanding the strengths and limitations of PLS, researchers can effectively apply this method
to address research questions and test theoretical models in their respective domains.
Confirmatory Factor Analysis (CFA) is a statistical technique used to test the validity
of a measurement model by assessing the extent to which observed variables (indicators) reflect
underlying latent constructs (factors) (Brown, 2015). It is commonly employed in the field of
psychology, social sciences, and business research to evaluate the reliability and validity of measurement instruments. CFA enables researchers to confirm or refute the hypothesized structure of latent variables and examine the relationships between observed indicators and their underlying factors.
One of the primary objectives of CFA is to assess the fit between the proposed measurement
model and the observed data. Fit indices such as the Comparative Fit Index (CFI), Tucker-Lewis
Index (TLI), and Root Mean Square Error of Approximation (RMSEA) are used to evaluate the
goodness-of-fit of the model (Hu & Bentler, 1999). A well-fitting model indicates that the
observed variables adequately represent the underlying constructs, providing evidence for the construct validity of the measurement model.
CFA also enables researchers to assess the convergent and discriminant validity of the
measurement model. Convergent validity is evaluated by examining the factor loadings of the
observed variables, with higher loadings indicating stronger relationships with the underlying
construct (Brown, 2015). Additionally, the Average Variance Extracted (AVE) is used to assess
the amount of variance captured by the latent variables relative to measurement error (Fornell &
Larcker, 1981). Discriminant validity, on the other hand, is assessed by comparing the AVE of
each construct with the squared correlations between constructs, ensuring that the constructs are empirically distinct from one another.
Moreover, CFA allows researchers to investigate the reliability of the measurement model by
examining the internal consistency of the observed variables within each latent construct.
Cronbach's alpha coefficient is commonly used to assess the reliability of the measurement
scales, with values above 0.70 generally considered acceptable (Nunnally & Bernstein, 1994).
Additionally, composite reliability (CR) can be computed to assess the internal consistency of
the measurement model based on the standardized factor loadings of the indicators (Hair et al.,
2019). A reliable measurement model ensures that the observed variables consistently measure the intended constructs.
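For reference, the standard formulas for these statistics, with \lambda_i denoting standardized factor loadings, \theta_i = 1 - \lambda_i^2 the indicator error variances, and k the number of indicators, are:

    \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i} \sigma_{i}^{2}}{\sigma_{T}^{2}}\right), \qquad CR = \frac{(\sum_{i} \lambda_i)^2}{(\sum_{i} \lambda_i)^2 + \sum_{i} \theta_i}, \qquad AVE = \frac{\sum_{i} \lambda_i^2}{k}

where \sigma_i^2 is the variance of item i and \sigma_T^2 the variance of the total scale score.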
Overall, Confirmatory Factor Analysis (CFA) is a valuable tool for evaluating the validity and reliability of measurement models. By assessing the fit of the measurement model to the observed data, as well as the convergent and discriminant validity of the constructs, researchers can ensure that their measurement instruments accurately capture the underlying concepts of interest. Additionally, CFA provides insights into the reliability of the measurement scales, supporting sound inferences in subsequent analyses.
Conceptual Framework
[Conceptual framework diagram]
Independent variables (innovation): product process, product development, product size, and product quality.
Dependent variable: organizational performance, measured by financial profitability and sales volume.
Control variables: organization size and organization culture.
The conceptual model presented depicts a comprehensive framework for understanding the relationship between innovation and organizational performance, identifying the factors that contribute to how innovation impacts an organization, drawing upon insights from
innovation theory. The model outlines both independent and dependent variables of innovation,
as well as control variables that may moderate this relationship. The model delineates several
independent variables believed to exert influence on innovation within organizations. First and foremost is product process innovation, which encompasses the creation of new products and services as well as the development of more efficient production methods. Studies by Lee et al.
(2021) suggest that process innovation can significantly enhance product quality and reduce
production times, ultimately leading to cost savings and a competitive edge. Another critical
variable is product development, which refers to the entire process of bringing a new offering to market. Effective product development practices, as noted by Cavanagh, Cavanagh, and Farley (2020), can expedite innovation and lead to faster market entry,
potentially boosting profitability. Additionally, the scale and intricacy of a product, known as
product complexity, can influence how an organization approaches innovation. Highly complex
products might necessitate substantial R&D investments and longer development cycles, while
simpler products might allow for quicker innovation iterations (Lin, 2018). Lastly, a focus on
superior product quality, as highlighted by Lin (2018), serves as a driver for innovation.
Companies continuously strive to develop products with improved features and functionalities to meet evolving customer expectations.
The model further identifies three key dependent variables that are potentially impacted by
innovation. Financial profitability stands out as a primary outcome, driven by the quest for
increased financial returns. Developing new products or services that command premium prices
or streamline production costs can directly contribute to a company's profitability (Lee et al., 2021). Organizational growth is another key outcome, which may manifest as an increase in workforce size, a broader product portfolio, or expanded market reach. Additionally,
organizational culture plays a pivotal role in innovation. A culture that fosters creativity, risk-
taking, and collaboration is crucial for successful innovation. Conversely, rigid hierarchies or risk-averse cultures can stifle innovative efforts.
The model also acknowledges the role of control variables that can influence the relationship
between innovation and performance. These variables, such as organization size and culture, can
affect how effectively an organization implements its innovation initiatives. Recent academic
sources provide empirical support for the conceptual framework outlined in the model. Studies
by Lee et al. (2021), Cavanagh, Cavanagh, and Farley (2020), and Lin (2018) offer insights into the impact of various independent variables on innovation and its outcomes, thereby enriching the model's theoretical grounding.
3.1 Preamble
This chapter presents the methodologies that would be employed in the study. It covers the research design, study population, sampling procedure, research instrument, and methods of data analysis.
A research design is the overall plan or strategy that a researcher develops to guide the process of
collecting, analyzing, and interpreting data in a mixed methods study. Research design outlines
how both qualitative and quantitative components of a study will be integrated to address the
research question or objective effectively (Creswell & Plano Clark, 2017). It helps to determine the method of data collection and analysis, as well as how these will answer the research questions (Grey, 2014). Data will be gathered cross-sectionally, while a correlational design will be used to analyze the relationships between product development, innovation, and organizational performance.
The target population for this study is the skilled labour of Seven-Up Bottling Company Plc,
Ibadan, and Nigeria Brewery Plc, Ibadan. The choice of skilled labour is because of the
objectives the study seeks to achieve and because of the ability of the skilled labour to easily
understand the contents of the questionnaire. Also, the study firms were chosen because of the relevance of their product development and innovation activities to the study objectives.
The total number of skilled labour in Nigeria Brewery Plc, Ibadan is Five Hundred and Twenty-
three (523) (Nigeria Brewery annual report, 2018) while that of Seven-Up Bottling Company
Plc, Ibadan is Two Hundred and Thirty-two (232) as at May 2018 (Seven-Up Bottling annual
report, 2018). The total population for the study is seven hundred and fifty-five (755). This
study targeted the skilled labour in Seven-Up Bottling Company Plc, Ibadan, and Nigeria
Brewery Plc, Ibadan. The Sample size for the study was calculated by using Taro Yamane
(Yamane, 1973) formula with a 95% confidence level. The sample size is 262 samples selected
from the population. A simple random sampling technique was further used to distribute the
questionnaires to ensure that each sampling unit of the population has an equal chance of being selected. The questionnaires were distributed such that Nigeria Brewery Plc, Ibadan received one hundred and fifty-seven (157), representing sixty percent (60%) of the questionnaires, to be administered to its skilled staff, while one hundred and five (105), representing forty percent (40%), were administered to the skilled staff of Seven-Up Bottling Company Plc, Ibadan. Nigeria Brewery Plc, Ibadan was allocated sixty percent of the questionnaires because its number of skilled staff is more than double that of Seven-Up Bottling Company Plc, Ibadan.
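The Taro Yamane formula and the resulting computation, using a 5% margin of error consistent with the 95% confidence level, are:

    n = \frac{N}{1 + N e^{2}} = \frac{755}{1 + 755(0.05)^{2}} = \frac{755}{2.8875} \approx 262

where N is the total population and e the margin of error.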
The primary data sources are the firms' employees. The sample is drawn from the 262 skilled employees selected across the two firms.
The study utilized a structured questionnaire to gather data from the study population using a
Likert scale measurement (Strongly agree, Agree, Neutral, Disagree, and Strongly disagree). The
questionnaire will be divided into four parts. The first part contained respondents' demographic information, while the remaining parts covered innovation, product development, and organizational performance. The consent of each respondent will be sought and obtained through an endorsed consent form before the questionnaires were
administered.
For a research instrument to be approved as valid, it has to measure what it is meant to measure
by asking appropriate research questions that can enable the findings to go along specified
research objectives. Using the face validity method, the researcher requested an expert (the supervisor) in the field to confirm that the items measured what they were intended to measure. Secondly, the researcher made use of content validity, which focused on the conceptualization and operationalization of the variables to ensure that all the concepts were covered.
A test instrument is said to be reliable when measurements taken with the same instrument over and over again yield results that are consistently similar, with variation small enough to be counted as insignificant. The research ensured the reliability of the questionnaire to determine its consistency in testing what it was intended to measure. The reliability of the instrument used for this study was assessed by giving the same set of questionnaires to the same set of respondents at different points in time. The first set of questionnaires was administered to 10 respondents, and the same instrument was later re-administered to the same respondents to check the consistency of their responses.
The collected data will be transferred to Microsoft Excel for the purpose of data cleansing and
preparation in anticipation of analysis. The study's dataset will be subjected to two distinct forms
of analysis. Firstly, a descriptive statistical analysis will be conducted. This form of analysis
serves to precisely delineate, visually present, and succinctly summarize data points, enabling the identification of patterns in the data, including measures of central tendency and dispersion. Secondly, an inferential analysis will be conducted, aimed at addressing and testing the hypotheses formulated in the preceding chapter. For the data analysis phase, SPSS Statistics software will be employed. Utilizing ANOVA and the independent-samples t-test, the data will be examined to ascertain whether statistically significant distinctions exist between groups. Furthermore, descriptive statistics will be computed with respect to the demographic characteristics of the respondents.
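As an illustrative sketch of these tests in Python (using SciPy with simulated scores rather than SPSS and the actual survey data; the group splits are hypothetical):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    # Hypothetical Likert-scale composite scores for the two firms
    nb_scores = rng.normal(loc=3.8, scale=0.6, size=157)   # Nigeria Brewery respondents
    su_scores = rng.normal(loc=3.5, scale=0.7, size=105)   # Seven-Up respondents

    # Independent-samples t-test: do the two firms differ significantly?
    t_stat, t_p = stats.ttest_ind(nb_scores, su_scores)

    # One-way ANOVA across hypothetical subgroups (e.g., three age brackets)
    g1, g2, g3 = nb_scores[:50], nb_scores[50:100], nb_scores[100:]
    f_stat, f_p = stats.f_oneway(g1, g2, g3)

    print(f"t = {t_stat:.3f}, p = {t_p:.4f}")
    print(f"F = {f_stat:.3f}, p = {f_p:.4f}")

A p-value below 0.05 would indicate a statistically significant difference at the 95% confidence level used for the sample-size calculation.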