
Business Research

Data Collection :-
In statistics, data collection is the process of gathering information from all
relevant sources to find a solution to the research problem. It helps to
evaluate the outcome of the problem and allows the researcher to draw
conclusions about the question under study. Most organizations use data
collection methods to make projections about future probabilities and trends.
Once the data are collected, they must be organized before analysis.

Data can be classified into two types, namely primary data and secondary data.
The primary importance of data collection in any research or business process is
that it helps to determine many important things about the organization,
particularly its performance. So, the data collection process plays an
important role in every field. The two broad categories of methods are:

• Primary Data Collection Methods

• Secondary Data Collection Methods

Primary Data Collection Methods


Primary data, or raw data, is information obtained directly from a first-hand
source through experiments, surveys or observations. Primary data collection
methods are further classified into two types:
• Quantitative Data Collection Methods
• Qualitative Data Collection Methods

Let us discuss the different techniques used to collect data under
these two data collection methods.
Quantitative Data Collection Methods
This approach is based on mathematical calculations and uses formats such as
closed-ended questions, correlation and regression methods, and measures like
the mean, median or mode. It is cheaper than qualitative data collection
and can be applied within a short duration of time.
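
As a hedged illustration (not part of the original text), the short Python sketch below computes the kind of summary measures mentioned above for made-up closed-ended survey responses; statistics.correlation requires Python 3.10 or later.

# Illustrative sketch only: summary measures for invented closed-ended survey data.
import statistics

ratings  = [4, 5, 3, 4, 5, 2, 4, 4, 5, 3]          # answers to a 1-5 closed-ended question
ad_spend = [10, 12, 9, 15, 11, 8, 14, 13, 16, 10]  # hypothetical paired variable

print("Mean:", statistics.mean(ratings))
print("Median:", statistics.median(ratings))
print("Mode:", statistics.mode(ratings))
# Correlation between the two invented variables (Python 3.10+).
print("Correlation:", statistics.correlation(ad_spend, ratings))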
Qualitative Data Collection Methods

It does not involve any mathematical calculations. This method is closely
associated with elements that are not quantifiable. Qualitative data
collection includes interviews, questionnaires, observations, case
studies, etc. Several methods are used to collect this type of data:
Observation Method
The observation method is used when the study relates to behavioural science.
It is planned systematically and is subject to many controls and checks.
The different types of observation are:
• Structured and unstructured observation
• Controlled and uncontrolled observation
• Participant, non-participant and disguised observation

Interview Method
This method collects data in the form of verbal responses. It is carried out in
two ways:
Personal Interview – In this method, an interviewer asks questions face to face
to the respondent. The personal interview can be structured or unstructured and
may take the form of a direct investigation, a focused conversation, etc.
Telephonic Interview – In this method, an interviewer obtains information by
contacting people on the telephone and asking the questions or seeking views verbally.
Questionnaire Method
In this method, a set of questions is mailed to the respondents, who read,
answer and subsequently return the questionnaire. The questions are
printed in a definite order on the form. A good questionnaire should have the
following features:

• Short and simple


• Should follow a logical sequence
• Provide adequate space for answers
• Avoid technical terms
• Should have a good physical appearance, such as colour and quality of
paper, to attract the attention of the respondent
Secondary Data Collection Methods
Secondary data is data collected by someone other than the actual user; the
information already exists and has been gathered or analysed by someone else.
Secondary data sources include magazines, newspapers, books, journals, etc. It may
be either published data or unpublished data.

Published data are available in various resources, including:
• Government publications
• Public records
• Historical and statistical documents
• Business documents
• Technical and trade journals

Unpublished data includes:
• Diaries
• Letters
• Unpublished biographies, etc.

Data Analysis:-
Data analysis is the process of systematically collecting, cleaning,
transforming, describing, modeling, and interpreting data, generally
employing statistical techniques. Data analysis is an important part of both
scientific research and business, where demand has grown in recent years for
data-driven decision making. Data analysis techniques are used to gain useful
insights from datasets, which can then be used to make operational decisions
or guide future research. With the rise of "Big Data", the storage of vast
quantities of data in large databases and data warehouses, there is an increasing
need to apply data analysis techniques to generate insights from volumes of
data far too large to be handled manually or with low-capacity tools.
Types :-

Descriptive Analysis
Descriptive analysis is used to summarize and describe the main features of a
dataset. It involves calculating measures of central tendency and dispersion
to describe the data. Descriptive analysis provides a comprehensive
overview of the data and insights into its properties and structure.
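
A minimal sketch of descriptive analysis, assuming pandas is available; the sales figures are invented purely for illustration.

# Descriptive analysis sketch: central tendency and dispersion for invented sales data.
import pandas as pd

sales = pd.Series([120, 135, 150, 110, 160, 145, 130])
print(sales.describe())                    # count, mean, std, min, quartiles, max
print("Range:", sales.max() - sales.min())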

Inferential Analysis
Inferential analysis uses statistical models and hypothesis testing to make
inferences about population parameters, such as the mean or proportion, based
on a sample. It involves using models and hypothesis tests to make
predictions and draw conclusions about the population.
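
As a sketch of this idea (assuming scipy and using invented sample values), a one-sample t-test infers whether the population mean differs from a hypothesised value:

# Inferential analysis sketch: one-sample t-test on an invented sample.
from scipy import stats

sample = [4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 5.2, 4.9]
t_stat, p_value = stats.ttest_1samp(sample, popmean=5.0)
print("t statistic:", t_stat, "p-value:", p_value)
# A small p-value would suggest the population mean differs from 5.0.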
Predictive Analysis
Predictive analysis is used to predict future events or outcomes based on
historical data and other relevant information. It involves using statistical
models and machine learning algorithms to identify patterns in the data and
make predictions about future outcomes.
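
The following sketch, assuming scikit-learn and using invented monthly revenue figures, shows the basic pattern: fit a model on historical data, then predict a future value.

# Predictive analysis sketch: simple linear regression on invented historical revenue.
import numpy as np
from sklearn.linear_model import LinearRegression

months  = np.array([[1], [2], [3], [4], [5], [6]])   # historical time index
revenue = np.array([100, 108, 115, 123, 131, 140])   # historical outcomes

model = LinearRegression().fit(months, revenue)
print("Predicted revenue for month 7:", model.predict([[7]])[0])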

Prescriptive Analysis
Prescriptive analysis is a decision-making analysis that uses mathematical
modeling, optimization algorithms, and other data-driven techniques to
identify the best course of action for a given problem or situation. It combines
mathematical models, data, and business constraints to find the best decision.
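
A hedged sketch of prescriptive analysis using scipy's linear-programming solver; the products, profits and resource constraints below are entirely hypothetical.

# Prescriptive analysis sketch: pick production quantities that maximise profit
# subject to invented resource constraints, via linear programming.
from scipy.optimize import linprog

c = [-20, -30]            # maximise 20*x1 + 30*x2 by minimising its negative
A = [[1, 2],              # machine hours used per unit of each product
     [3, 1]]              # labour hours used per unit of each product
b = [40, 45]              # available machine and labour hours

result = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)], method="highs")
print("Optimal quantities:", result.x, "Maximum profit:", -result.fun)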

Text Analysis
Text analysis is a process of extracting meaningful information from
unstructured text data. It involves a variety of techniques, including natural
language processing (NLP), text mining, sentiment analysis, and topic
modeling, to uncover insights and patterns in text data.
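
As a small, hedged example of one text-mining step (word-frequency counting on invented review text), not a full NLP pipeline:

# Text analysis sketch: word frequencies in unstructured, invented review text.
from collections import Counter
import re

reviews = [
    "Great service and fast delivery",
    "Delivery was slow but the service was great",
]
words = re.findall(r"[a-z']+", " ".join(reviews).lower())
print(Counter(words).most_common(5))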
Diagnostic Analysis
Diagnostic analysis seeks to identify the root causes of specific events or
outcomes. It is often used in troubleshooting problems or investigating
anomalies in data.
Hypothesis Testing:-
In today's data-driven world, decisions are based on data all the time.
Hypotheses play a crucial role in that process, whether in business
decisions, the health sector, academia, or quality improvement.
Without hypotheses and hypothesis tests, you risk drawing the wrong
conclusions and making bad decisions. This section looks at
hypothesis testing in statistics.
Null Hypothesis and Alternate Hypothesis

The null hypothesis is the default assumption that there is no effect, i.e. that
the claimed event will not occur. A null hypothesis has no bearing on the
study's outcome unless it is rejected. It is denoted H0, pronounced "H-naught".
The alternate hypothesis is the logical opposite of the null hypothesis; it is
accepted only when the null hypothesis is rejected. It is denoted H1.

Let’s understand this with an example.


A sanitizer manufacturer claims that its product kills 95 percent of germs on
average.
To put this company's claim to the test, create a null and an alternate hypothesis.
H0 (Null Hypothesis): The average is 95%.

H1 (Alternate Hypothesis): The average is less than 95%.
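
A hedged sketch of how this test could be run, assuming scipy 1.6+ and using invented measured kill rates (the text gives no data):

# One-sided one-sample t-test of H0: mean = 95% against H1: mean < 95%.
from scipy import stats

kill_rates = [94.2, 93.8, 95.1, 92.9, 94.5, 93.4, 94.0, 93.7]  # invented measurements
t_stat, p_value = stats.ttest_1samp(kill_rates, popmean=95, alternative="less")
print("t:", t_stat, "p-value:", p_value)
# If p_value is below the chosen significance level (say 0.05),
# reject H0 (average = 95%) in favour of H1 (average < 95%).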


Parametric and Non-Parametric Tests:-
Parametric Test:-
A parametric test is a statistical test that makes certain assumptions about
the distribution from which the data are drawn, and the test statistic is valid
only under these assumptions. A significance test under a simple Normal model,
for example, assumes that the observations are normally distributed, independent
(the result of an independent process), identically distributed, and have
constant mean and variance. Therefore, an integral part of applying such a test
is making sure its assumptions are adequate vis-à-vis the observed data. This
process is called mis-specification testing.
Modern parametric statistical inference is dominated by the concept of the
likelihood function: the specified statistical model fully determines the
likelihood function, and all estimators rely on it in one way or another,
including the notion of a uniformly most powerful test, which is a cornerstone
of power analysis.
Parametric tests have the benefit of being explicit in their assumptions, which
leads to more precise inferences and the ability to run mis-specification tests.
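
A minimal sketch of a classic parametric test (an independent two-sample t-test), with a rough normality check standing in for mis-specification testing; scipy is assumed and the data are invented.

# Parametric test sketch: independent two-sample t-test on invented groups.
from scipy import stats

group_a = [23, 25, 28, 22, 26, 27, 24]
group_b = [30, 29, 32, 31, 28, 33, 30]

# Crude mis-specification check: Shapiro-Wilk test of normality for each group.
w_a, p_a = stats.shapiro(group_a)
w_b, p_b = stats.shapiro(group_b)
print("Normality p-values:", p_a, p_b)

t_stat, p_value = stats.ttest_ind(group_a, group_b)
print("t:", t_stat, "p-value:", p_value)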
Non-Parametric Test:-
Non-parametric tests are statistical tests that do not require assumptions
about the underlying population distribution. They do not rely on the data
belonging to any particular parametric family of probability distributions.
Non-parametric methods are also called distribution-free tests since they do
not assume any underlying population distribution.
Reasons to Use Non-Parametric Tests

It is important to assess when to apply parametric and non-parametric tests
in order to arrive at the correct statistical inference. The reasons to use a
non-parametric test are given below:
• When the distribution is skewed, a non-parametric test is used. For skewed
distributions, the mean is not the best measure of central tendency; hence,
parametric tests built around the mean are not suitable.
• If the size of the data is too small, then validating the distribution of the
data becomes difficult. Thus, in such cases, a non-parametric test is used to
analyze the data.
• If the data is nominal or ordinal, a non-parametric test is used, because a
parametric test generally requires continuous data.
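
Below is a hedged sketch of a common non-parametric counterpart, the Mann-Whitney U test, run on invented ordinal satisfaction scores (scipy assumed):

# Non-parametric test sketch: Mann-Whitney U test on invented ordinal scores.
from scipy import stats

group_a = [3, 5, 4, 2, 5, 4, 3]   # e.g. satisfaction scores from one branch
group_b = [1, 2, 2, 3, 1, 2, 2]   # scores from another branch
u_stat, p_value = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
print("U:", u_stat, "p-value:", p_value)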
