
NAME – AKANSHA BABURAO NEBAPURE
ROLL NUMBER – 2314107681
PROGRAM – BACHELOR OF BUSINESS ADMINISTRATION (BBA)
SEMESTER – III
COURSE NAME – QUANTITATIVE TECHNIQUES FOR MANAGEMENT
COURSE CODE – DBB2102
ANS 1

A. Primary Data Sources


Primary data is original data gathered directly from first-hand sources. Common sources
include:

-Questionnaires and Surveys: Structured instruments used to collect data on attitudes, behaviors, and other traits from a large number of respondents.
-Interviews: Structured, semi-structured, or unstructured one-on-one or group conversations used to gain in-depth understanding.
-Observations: Recording actions or events as they occur in their natural setting.
-Experiments: Controlled tests or trials conducted to establish causal relationships.
-Focus Groups: Facilitated discussions aimed at gathering diverse viewpoints on a particular subject.
-Field Trials: Testing goods or services in actual use to learn about their functionality and user experience.

Secondary Data Sources


Information that has been previously gathered for a different purpose is known as secondary data.
Common sources include:

-Books and Journals: Published works presenting findings from theoretical and empirical research.
-Government Publications and Reports: Information gathered and shared by government agencies, including health statistics, economic reports, and census data.
-Industry Reports: Evaluations and insights produced by industry professionals or market research firms.
-Online Databases: Digital repositories holding a variety of data and research publications, such as JSTOR, PubMed, or company databases.
-Media Sources: Periodicals, newspapers, and online news sites covering current events and trends.
-Internal Company Records: Information gathered from within a company, including internal reports, financial accounts, and sales records.

B. Characteristics of a Good Questionnaire


A good questionnaire is designed to collect relevant and reliable data quickly and efficiently.
Important characteristics include:
-Clarity and Simplicity: Questions should be succinct, unambiguous, and free of technical jargon so that all respondents can easily understand them.
-Relevance: All inquiries must be directly related to the goals of the research; superfluous or
irrelevant inquiries should be avoided.
-Brevity: To avoid respondent fatigue, the questionnaire should be as brief as possible while still obtaining the required data.
-Logical Flow: Questions ought to be arranged rationally, frequently beginning with simpler, less
delicate subjects and working up to more difficult or delicate ones.
-Neutrality: Questions ought to be objective and should not influence respondents' responses in
any way.
-Diversity in Question Types: To collect data that is both quantitative and qualitative, a
combination of closed- and open-ended questions should be used.
-Pilot Testing: To find and fix any problems, the questionnaire should be tested on a small sample
of people before being fully deployed.
-Anonymity and Confidentiality: Assuring respondents that their responses will be kept private can boost response rates and honesty.
-Well-Composed Answer Choices: In multiple-choice questions, offer a carefully constructed set of options that covers all likely responses.
-Instructions: Clear instructions should be given for each section so that respondents understand how to answer the questions accurately.

ANS 2

A. Mean = ∑(X·f) / ∑f

Here, X represents the data values and f represents the frequencies of those values.

Given the data:

X f
2 1
4 4
6 6
8 4
10 1

Calculating X·f for each data value:

X    f    X·f
2    1    2
4    4    16
6    6    36
8    4    32
10   1    10

Now, we sum the X·f column and the f column:

∑(X·f) = 2 + 16 + 36 + 32 + 10 = 96

∑f = 1+4+6+4+1 = 16

Mean = 96/16 = 6

Therefore, the mean of the given data is 6.
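
For illustration, the same calculation can be done in a few lines of Python (a minimal sketch using the frequency table above):

# Mean of a frequency distribution: sum(X*f) / sum(f)
X = [2, 4, 6, 8, 10]   # data values
f = [1, 4, 6, 4, 1]    # corresponding frequencies

mean = sum(x * fx for x, fx in zip(X, f)) / sum(f)
print(mean)  # 6.0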

B. A well-defined measure of dispersion offers a thorough understanding of the variability, or spread, of a dataset. Qualities of a good dispersion measure include:

1. Simple and Easy to Understand


An effective dispersion measure should be simple to compute and understand so that users can
quickly determine how variable the data is.

2. Based on All Data Points


To give a comprehensive view of the variability of the data, it should consider every observation
in the dataset. This guarantees that the measure is not disproportionately influenced by any one
data point.

3. Mathematically Rigorous
To ensure accuracy and dependability, the measure must have a precise mathematical definition
and yield consistent results under comparable circumstances.

4. Absolute Measure
Although relative measurements such as the coefficient of variation are useful, an absolute
measure of dispersion such as the standard deviation gives a precise number that represents the
extent of data spread.

5. Sensitive to Extreme Values


An effective dispersion measure should reflect the influence of extreme values or outliers and the
degree to which they depart from the mean.

6. Simple to Calculate
The computation should be efficient, free of intricate calculations that can lead to
errors or be difficult to carry out by hand.

7. Dimensionally Consistent
To facilitate simpler interpretation in relation to the original data, the measure ought to have the
same units as the data.

8. Appropriate for the Data Type


Different dispersion metrics work well with different kinds of data (e.g., standard deviation for
regularly distributed data, range for small datasets). The type and distribution of the data should
be taken into account while selecting a measure.

9. Useful for Further Statistical Analysis


A reliable dispersion measure should be useful in more statistical analyses, like confidence
interval estimates and hypothesis testing, to increase its applicability in a wider range of
situations.
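
To make these qualities concrete, the sketch below computes three common dispersion measures in Python. The dataset values are made-up assumptions for illustration, not from the text:

# Common dispersion measures for a small illustrative dataset
import statistics

data = [4, 8, 6, 5, 9, 7]

data_range = max(data) - min(data)     # uses only the two extreme points
stdev = statistics.stdev(data)         # sample standard deviation, uses all points
cv = stdev / statistics.mean(data)     # coefficient of variation (relative measure)

print(data_range, round(stdev, 2), round(cv, 3))

Note the contrast: the range depends on only two observations, while the standard deviation is based on every data point, one of the qualities listed above.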

ANS 3 A.

B. Regression analysis is a powerful statistical method for analyzing the relationship between
two or more variables. It helps explain how the dependent variable changes in response to
changes in any one independent variable while the other independent variables are held constant.

Uses of Regression Analysis


-Predictive Analysis: Regression models use historical data to forecast future values, for instance weather, stock prices, or sales.
-Trend Analysis: Locating and measuring long-term patterns in data, such as patterns in consumer behavior or economic growth.
-Establishing Relationships: Understanding the type and strength of relationships between variables, such as how advertising expenditure affects sales.
-Forecasting: Estimating future events, for example demand forecasting in supply chain management.
-Optimization: Identifying the optimal level of inputs to achieve desired outputs, useful in fields like operations research and resource management.
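
As a concrete illustration of these uses, here is a minimal sketch of simple linear regression in Python. The advertising-spend and sales figures are made-up numbers assumed purely for demonstration:

# Simple linear regression: how does advertising spend relate to sales?
import numpy as np

ad_spend = np.array([10, 20, 30, 40, 50], dtype=float)  # independent variable X
sales = np.array([25, 44, 58, 81, 96], dtype=float)     # dependent variable Y

# Fit Y = a + b*X by least squares; np.polyfit returns [b, a] for degree 1
b, a = np.polyfit(ad_spend, sales, 1)
print(f"Estimated line: Y = {a:.2f} + {b:.2f}X")

# Use the fitted line to forecast sales at a new spend level
print("Predicted sales at X = 60:", round(a + b * 60, 1))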

Examples:

1. Marketing:
-Advertising Effectiveness: Determining the most effective medium by examining the effects of various advertising channels on sales.
-Customer Retention: Predicting potential attrition from past purchases, customer reviews, and engagement metrics.

2. Finance:
-Stock Market Analysis: Predicting stock prices based on economic indicators, company performance, and historical price data.
-Risk Management: Evaluating how different financial risks will affect investment portfolios and creating plans to reduce those risks.

3. Health Care:
-Patient Outcomes: Forecasting patient outcomes from medical history, type of treatment, and demographics.
-Disease Outbreaks: Predicting how infectious diseases will spread by taking into account biological, social, and environmental factors.

4. Human Resources:
-Employee Performance: Examining factors like experience, training, and workplace culture that affect workers' output and performance.
-Salary Predictions: Estimating appropriate pay ranges based on market trends, education, experience, and job roles.

5. Supply Chain Management and Operations:
-Inventory Management: Estimating product demand in order to optimize stock levels and minimize holding costs.
-Supply Chain Optimization: Analyzing the connection between supply chain factors (such as supplier reliability and transportation costs) and the overall effectiveness of the supply chain.

ANS 4

Methods of Measuring Secular Trends


1. Graphical or Freehand Method
Description: The general direction of the trend is determined by visually fitting a trend line through the points on a graph of the data.

Advantages: Simple and easy to apply; gives a visual depiction of the trend.

Disadvantages: Subjective and dependent on the analyst's judgment; unsuitable for in-depth analysis.

Use Case: Good for exploratory analysis or when there are few data points and a quick visual understanding is needed.

2. Semi-Average Method
Description: The data is divided into two equal parts, the average of each part is calculated, and a trend line is plotted on the graph using these averages.

Steps:

Divide the data into two equal halves (excluding the middle point if the number of data points is odd).
Compute the average of each half.
Plot these averages on the graph.
Connect the points with a straight line that spans the entire data range.
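
A minimal Python sketch of the semi-average calculation, using a made-up series of eight annual values:

# Semi-average method: average each half of the series
data = [12, 15, 14, 18, 20, 22, 21, 25]  # illustrative annual values

half = len(data) // 2
first_avg = sum(data[:half]) / half
second_avg = sum(data[half:]) / (len(data) - half)

# Plot each average at the midpoint of its half and join them with a straight line
print(first_avg, second_avg)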

3. Moving Average Method
Description: This method reduces short-term fluctuations by averaging data points over a predetermined window (such as three or five years). The moving averages are then plotted to form the trend line.

Steps:

Select a window size (time period) for the moving average.
Compute the moving average by averaging the data points within each window.
Plot the moving averages against the corresponding time intervals.
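
A minimal Python sketch of a three-period moving average, with an illustrative made-up series:

# 3-period moving average to smooth short-term fluctuations
data = [10, 12, 11, 15, 14, 18, 17]  # illustrative series

window = 3
moving_avg = [sum(data[i:i + window]) / window
              for i in range(len(data) - window + 1)]
print(moving_avg)  # one smoothed value per complete window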

4. Least Squares Method

Description: This statistical method fits a trend line to the data by minimizing the sum of the
squares of the vertical distances of the points from the line. The trend line can be linear or take
other forms (e.g., quadratic, exponential).

Steps:

1. Formulate the trend equation (e.g., linear Y=a+bX).


2. Calculate the coefficients a and b using the least squares formula.
3. Plot the trend line using the calculated equation.
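
For reference, for a linear trend Y = a + bX fitted to n data points, the least squares coefficients are given by the standard formulas:

b = (n∑XY − ∑X·∑Y) / (n∑X² − (∑X)²)
a = (∑Y − b·∑X) / n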
5. Exponential Smoothing
Description: This technique gives more weight to recent data points by applying exponentially decreasing weights to earlier observations.

Steps:

Select a smoothing constant α between 0 and 1.
Compute smoothed values iteratively using the smoothing formula S(t) = α·Y(t) + (1 − α)·S(t−1).

Advantages: Excellent for data that exhibits a steady pattern over time; adapts swiftly to shifts in the trend.
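
A minimal Python sketch of simple exponential smoothing, applying the formula above to an illustrative made-up series with α = 0.3:

# Simple exponential smoothing: S(t) = alpha*Y(t) + (1 - alpha)*S(t-1)
def exponential_smoothing(series, alpha):
    smoothed = [series[0]]  # initialize with the first observation
    for y in series[1:]:
        smoothed.append(alpha * y + (1 - alpha) * smoothed[-1])
    return smoothed

sales = [100, 104, 99, 110, 115, 112]  # illustrative series
print(exponential_smoothing(sales, alpha=0.3))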

ANS 5
There are a number of difficulties and potential pitfalls in constructing index numbers. Index
numbers are used to track economic variables such as prices, quantities, and values over time.
Although they are effective instruments for economic analysis, the following issues in their
construction can make them problematic:

1. Base Period Selection


All other periods are measured against the base period, which serves as a benchmark. Issues include:

-Selecting a Fitting Base Period: The base should be a normal, stable period free of anomalies. The index can be misleading if the base period is not representative.
-Revising the Base Period: The base period can become outdated over time, requiring periodic updates to stay relevant.

2. Selection of Items
Selecting which items to add to the index can be difficult:

-Representativeness: The chosen items ought to be indicative of the larger category that they are
meant to assess. Biased indices may arise from the omission of important items.
-Modifications to Consumption Trends: As consumer tastes and consumption habits evolve,
maintaining the index's representativeness becomes more difficult.

3. Weight Assignment
It's critical to establish the proper weights for various items:

-Subjectivity: Weight assignment frequently involves subjective judgment. The weights selected might not fairly represent an item's actual significance within the index as a whole.
-Dynamic Weights: As items gain or lose importance over time, weights must be adjusted frequently, which can make index construction more difficult.

4. Price Data Collection


It is crucial to have consistent and accurate pricing data.

-Data Reliability and Availability: It can be challenging to find trustworthy data for every item in the index. Prices may differ by store, brand, and location.
-Price Adjustments and Discounts: Transient price adjustments, markdowns, and sales may add
volatility to the index, making it more difficult to identify long-term patterns.

5. Quality Changes
Over time, products' quality frequently changes:

-Quality Adjustment: The index may be distorted if changes in product quality are not taken into account. For instance, if a product's quality improves while its price stays the same, the effective price has fallen, and the index should reflect this.

6. New Products and Discontinuation


The market is dynamic, with new items coming out and older ones being phased out:

-Adding New Products: It can be difficult to decide whether and how to add new products to the
index. The quality and cost of new products can vary greatly from those of existing ones.
-Handling Discontinued Products: Finding appropriate replacements for discontinued products can be challenging.

7. Mathematical Formula
Results from different formulas can vary:

-Selecting a Formula: There are a number of formulas (Laspeyres, Paasche, Fisher, etc.) with
pros and cons for each. The weights given to the items and the index's sensitivity to price
fluctuations can both be impacted by the formula selection.
-Consistency: Although it's not always guaranteed, the formula should yield results that are
consistent over time.
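
To show how the choice of formula changes the result, here is a minimal Python sketch computing Laspeyres, Paasche, and Fisher price indices for a small basket; all prices and quantities are made-up illustrative assumptions:

# Price indices for a three-item basket
import math

p0 = [10, 20, 5]     # base-period prices
q0 = [100, 50, 200]  # base-period quantities
p1 = [12, 22, 6]     # current-period prices
q1 = [90, 55, 210]   # current-period quantities

# Laspeyres weights by base-period quantities; Paasche by current-period quantities
laspeyres = sum(p * q for p, q in zip(p1, q0)) / sum(p * q for p, q in zip(p0, q0)) * 100
paasche = sum(p * q for p, q in zip(p1, q1)) / sum(p * q for p, q in zip(p0, q1)) * 100
fisher = math.sqrt(laspeyres * paasche)  # geometric mean of the two

print(round(laspeyres, 2), round(paasche, 2), round(fisher, 2))

The two formulas weight the same price changes differently, which is exactly why the choice of formula matters.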

8. Substitution Bias
Price changes may cause consumers to alter their buying habits:
-Accounting for Substitution: Traditional fixed-weight indices may not accurately capture consumers substituting between goods when prices change, which introduces bias.

ANS 6 A.
Meaning of Sampling Method
Statistical sampling is the process of choosing a subset (sample) of people, objects, or data points
from a larger population in order to estimate the characteristics of the entire population. This
method is frequently used in surveys, research, and quality control to draw conclusions without
looking at the complete population, which can be difficult or impossible because of financial, time,
or resource limitations.

Principles of Sampling
-Representativeness: The sample should fairly represent the population's features. This
guarantees the generalizability of the conclusions made from the sample to the entire population.
-Randomization: Every individual in the population ought to have an equal probability of being
selected for the sample. In addition to ensuring that the sample is representative, this lessens bias.
-Adequate Sample Size: The sample size should be large enough to capture the population's
diversity and provide reliable estimates. Larger samples typically yield more accurate and
reliable results.
-Precision and Accuracy: High precision (low variability) and accuracy (closeness to the true population value) are desirable outcomes for sampling techniques.
-Cost-Effectiveness: The sampling strategy should be economical, taking into account both the
resources that are available and the requirement for accuracy and precision.
-Efficiency: The procedure should be as quick and easy to use as possible in order to obtain the
sample.

Different Sampling Techniques


1. Probability Sampling: Every member of the population has a known, non-zero chance of being selected.

-Simple Random Sampling: There is an equal chance of selection for every member of the
population.
-Stratified Sampling: Random samples are drawn from each stratum after the population is split
into groups or strata according to a particular trait.
-Systematic Sampling: Following a random starting point, every nth person in the population is
chosen.
-Cluster Sampling: The population is grouped into clusters, some of which are chosen at random, and then every member of the selected clusters is sampled.
2. Non-Probability Sampling: Not every member has an equal chance of being included, because members are chosen using non-random criteria.

-Convenience Sampling: Samples are chosen based on ease of access.
-Purposive or Judgmental Sampling: Samples are chosen according to the researcher's judgment.
-Quota Sampling: The population is divided into mutually exclusive subgroups, and samples are drawn from each group to satisfy a predetermined quota.
-Snowball Sampling: Current research participants recruit new participants from their social networks.
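
The probability techniques above can be sketched in a few lines of Python using the standard library; the population and strata below are made-up assumptions for illustration:

# Probability sampling sketches
import random

population = list(range(1, 101))  # an illustrative population of 100 units

# Simple random sampling: every unit has an equal chance of selection
srs = random.sample(population, 10)

# Systematic sampling: every k-th unit after a random start
k = 10
start = random.randrange(k)
systematic = population[start::k]

# Stratified sampling: split into strata, then sample randomly within each
strata = {"low": population[:50], "high": population[50:]}
stratified = [u for group in strata.values() for u in random.sample(group, 5)]

print(srs, systematic, stratified, sep="\n")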
B. Acceptance Sampling Plan
A quality control technique called acceptance sampling is used to determine whether to accept or
reject a batch of products based only on a sample. This technique aids in ascertaining whether a
batch satisfies predetermined requirements and standards.

Steps in an Acceptance Sampling Plan


-Define the Lot: Determine the size and composition of the lot (batch of products).
-Determine the Sample Size: Decide how many items from the lot will be sampled.
-Establish Acceptance Criteria: Set the acceptance criteria, such as the maximum number of defects acceptable in the sample.
-Select the Sample: Randomly choose items from the lot according to the sample size.
-Inspect the Sample: Check the selected items for defects or non-conformance.
-Make the Decision: Accept the lot if the number of defective items falls within the acceptable range; otherwise, reject it.

Principles of Acceptance Sampling


Objective: The main goal is to guarantee product quality while reducing the expense of inspection.
Risk Management: Balances the risk between the producer (Type I error: rejecting a good lot) and
the consumer (Type II error: accepting a bad lot).
Efficiency: Seeks to minimize the number of items inspected while preserving confidence in the lot's quality.
Standardization: To guarantee consistency and dependability in the acceptance process, use
standardized sampling plans (such as MIL-STD-105).
Cost-Effectiveness: Aims to attain quality assurance without requiring a 100% inspection in a way
that is both economical and efficient.

Acceptance Sampling Plan Types


1. Single Sampling Plan: A single sample is taken, and the lot is accepted or rejected based on the number of defects in the sample.
2. Double Sampling Plan: An initial sample is taken; if the results are inconclusive, a second sample is taken before a decision is made.
3. Multiple Sampling Plan: Several samples are collected in stages, and at each stage the lot is either accepted or rejected based on the cumulative results.
4. Sequential Sampling Plan: Items are sampled one at a time, and decisions are made during the inspection process, with the option to end the inspection early once a decision can be reached.
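
A minimal Python sketch of the single sampling plan decision rule; the lot composition, sample size n, and acceptance number c below are illustrative assumptions, not values from a standard plan:

# Single sampling plan: inspect n items, accept the lot if defectives <= c
import random

def single_sampling_decision(lot, n, c):
    sample = random.sample(lot, n)  # draw a random sample of n items
    defects = sum(1 for item in sample if item == "defective")
    return "accept" if defects <= c else "reject"

# An illustrative lot of 500 items containing 3% defectives
lot = ["defective"] * 15 + ["good"] * 485
print(single_sampling_decision(lot, n=50, c=2))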
