MSCP HND Computing Lo3

The document discusses various techniques for collecting and analyzing data, including:
• Observational data collection methods like inspections, surveys, and assessments.
• Common data collection techniques like surveys, interviews, and focus groups, along with their advantages and disadvantages.
• Steps for analyzing quantitative data, including data preparation, validation, editing, and coding.
• Popular data analysis tools like Excel, Tableau, Power BI, R, and Python.
It also discusses coding and typologies as important steps in qualitative data analysis.

Uploaded by

nayaki22 na

LO3

Present the project and communicate appropriate recommendations based on meaningful conclusions drawn from the evidence, findings, and/or analysis.

• Communicating outcomes

• Convincing arguments

• Critical and objective analysis and evaluation


Data Collection Techniques
• Observations
Making direct observations is a simple and unobtrusive way of collecting data. Gathering first-hand information
in the field gives the observer a holistic perspective that helps them understand the context in which the item
being studied operates or exists. The observations are recorded in field notes or on a mobile device if the
observer is collecting data electronically (as with Fulcrum).
Some examples of observational data collection are building inspections, safety checklists, agricultural surveys,
and damage assessments.
Observation is an effective method because it is straightforward and efficient: It doesn’t typically require
extensive training on the part of the data collector, and he or she is generally not dependent on other participants.
The biggest drawback of observational data is that it tends to be superficial and lack the context needed to
provide a complete picture.
• Surveys / Questionnaires:
Questionnaires are a popular means of data collection because they are inexpensive and can provide a broad perspective.
They can be conducted face-to-face, by mail, telephone, or Internet (in which case, they can include respondents from anywhere
in the world).
Surveys are often used when information is sought from a large number of people or on a wide range of topics (where in-depth
responses are not necessary).
They can contain yes/no, true/false, multiple choice, scaled, or open-ended questions — or all of the above. The same survey can
be conducted at spaced intervals to measure change over time.
Some of the advantages of surveys are that respondents can answer questions on their own time, and may answer more honestly
as questionnaires provide anonymity (whether real or perceived). And while the responses may be biased on the part of the
participant, they are free from the collector’s bias.
The main drawbacks are low response rates, delays in response, and the possibility of ambiguous or missing answers (and since
questionnaires are a passive tool, it is usually not possible to receive clarification).
Interviews
Interviews can be conducted in person or by phone, and can be structured (using
survey forms) or unstructured.
The downsides are that interviews require time and money to plan and execute —
including interviewer training — and they require more cooperation on the part of the
interviewee, who may be uncomfortable sharing personal information.

But there are also many benefits to interviews: they don't require literacy on the
part of the respondents, for one thing.
For another, they allow the interviewer (especially a well-trained one) to uncover deep
insight by clarifying and diving deeper into the respondent's answers, as well as by
collecting nonverbal data.
Telephone interviews are less expensive than in-person interviews, and provide
access to anyone in the world with a phone. They also provide a measure of
anonymity that may encourage the respondent to be more forthcoming with their
answers. But they lack the rich data of face-to-face interaction.
Focus Groups
• A focus group is simply a group interview of people who all have something in common.
They provide the same type of data as in-person interviews, but add a social element and
offer a broader understanding of why a group thinks or behaves in a particular way.
• Focus groups are useful when examining cultural values or other complex issues, but also
have their drawbacks. Lack of privacy or anonymity can present a major obstacle, as can
“group think,” or the potential for the group to be dominated by one or two participants.
• These sessions can be time-consuming and difficult, and require a leader who is skilled at
creating a relaxed, welcoming environment, drawing out passive participants, and even
dealing with conflict.
• While those are the four most common data collection techniques, there are as many
collection methods as there are types of data, such as self-reporting, document review,
testing, oral histories, and case studies, to name just a few.
Data Analysis Methods

• Analyzing Quantitative Data


Data Preparation
The first stage of analyzing data is data preparation, where the aim is to convert raw data into something meaningful and readable. It includes three steps.

Step 1: Data Validation
The purpose of data validation is to find out, as far as possible, whether the data collection was done as per the pre-set standards and without any bias. It is a four-step process, which includes:
• Fraud, to infer whether each respondent was actually interviewed or not.
• Screening, to make sure that respondents were chosen as per the research criteria.
• Procedure, to check whether the data collection procedure was duly followed.
• Completeness, to ensure that the interviewer asked the respondent all the questions, rather than just a few required ones.

To do this, researchers would need to pick a random sample of completed surveys and
validate the collected data. (Note that this can be time-consuming for surveys with lots
of responses.) For example, imagine a survey with 200 respondents split into 2 cities.
The researcher can pick a sample of 20 random respondents from each city. After this,
the researcher can reach out to them through email or phone and check their responses to
a certain set of questions.
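The sampling step described above can be sketched in a few lines of Python. This is a minimal illustration, not a prescribed procedure: the respondent records, city names, and sample size of 20 are taken from the example, while the function name and fixed random seed are my own additions.

```python
import random

# Hypothetical survey data: 200 respondents split across two cities.
respondents = [{"id": i, "city": "A" if i < 100 else "B"} for i in range(200)]

def validation_sample(records, city, n, seed=42):
    """Pick n random respondents from one city for callback validation."""
    pool = [r for r in records if r["city"] == city]
    rng = random.Random(seed)  # fixed seed so the audit sample is reproducible
    return rng.sample(pool, n)

sample_a = validation_sample(respondents, "A", 20)
sample_b = validation_sample(respondents, "B", 20)
print(len(sample_a), len(sample_b))  # 20 20
```

The researcher would then contact each sampled respondent by email or phone and re-check their answers against the recorded data.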
Step 2: Data Editing
Typically, large data sets include errors. For example, respondents may fill fields incorrectly or skip them accidentally. To make sure that there are no such errors, the researcher should conduct basic data checks, check for outliers, and edit the raw research data to identify and clear out any data points that may hamper the accuracy of the results.
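A simple editing pass along these lines can be sketched as follows. The raw values, the missing-entry convention (`None`), and the z-score threshold are illustrative assumptions; real editing rules depend on the study.

```python
import statistics

# Hypothetical raw ages, including a skipped field (None) and a typo (199).
raw_ages = [21, 34, None, 28, 199, 45, 30, 27]

def clean(values, max_z=2.0):
    """Drop missing entries, then drop points more than max_z
    standard deviations from the mean as likely entry errors."""
    present = [v for v in values if v is not None]
    mu = statistics.mean(present)
    sd = statistics.stdev(present)
    return [v for v in present if abs(v - mu) <= max_z * sd]

cleaned = clean(raw_ages)
print(cleaned)  # [21, 34, 28, 45, 30, 27]
```

In practice a flagged outlier would be reviewed rather than silently deleted, since it may be a genuine response.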
Step 3: Data Coding
This is one of the most important steps in data preparation. It refers to grouping and assigning values to responses from the survey. For example, if a researcher has interviewed 1,000 people and now wants to find the average age of the respondents, the researcher will create age buckets and categorize the age of each respondent according to these codes. (For example, respondents between 13 and 15 years old would have their age coded as 0, 16-18 as 1, 19-21 as 2, etc.) Then, during analysis, the researcher can work with simplified age brackets rather than a massive range of individual ages.
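The age-bucket coding above can be expressed as a one-line mapping. The three-year bucket width follows the example; the function name and handling of out-of-range ages are illustrative.

```python
# Each respondent's age is replaced by a small integer bucket code:
# 13-15 -> 0, 16-18 -> 1, 19-21 -> 2, and so on in three-year steps.
def code_age(age):
    if age < 13:
        return None  # outside the coding frame used in this example
    return (age - 13) // 3

ages = [14, 17, 20, 15, 19]
codes = [code_age(a) for a in ages]
print(codes)  # [0, 1, 2, 0, 2]
```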
Data Analysis Tools

• Excel
• Tableau
• Power BI
• Fine Report
• R & Python
Data Analysis Tools
Excel
• It has a variety of compelling features, and with additional plugins installed, it can handle a
massive amount of data. So, if your data set does not approach big-data scale,
Excel can be a very versatile tool for data analysis.
Tableau
• It falls under the BI tool category, made for the sole purpose of data analysis. At its core are
the pivot table and pivot chart, and it works towards representing data in the most user-
friendly way. It additionally has a data cleaning feature along with excellent analytical functions.
Power BI
• It initially started as a plugin for Excel, but later separated from it to develop into one of the
most popular data analytics tools. It comes in three versions: Free, Pro, and Premium. Its PowerPivot
and DAX language can implement sophisticated advanced analytics, similar to writing Excel
formulas.
Fine Report
• Fine Report comes with a straightforward drag-and-drop operation, which helps to design
various styles of reports and build a data decision analysis system. It can directly connect to all
kinds of databases, and its format is similar to that of Excel. It also provides a
variety of dashboard templates and several self-developed visual plug-in libraries.
• R & Python
These are programming languages which are very powerful and flexible.
R is best at statistical analysis, such as normal distribution, cluster classification
algorithms, and regression analysis. It also performs individual predictive analysis,
such as predicting a customer's behavior, spend, and preferred items based on their
browsing history. It also involves concepts of machine learning and artificial
intelligence.
• SAS
SAS is a programming language for data analytics and data
manipulation, which can easily access data from any source. SAS has introduced a
broad set of customer profiling products for web, social media, and marketing
analytics. It can predict customer behaviors and manage and optimize communications.
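As a taste of the regression analysis mentioned above, here is a minimal ordinary-least-squares fit written in plain Python (no external libraries). The visit/spend data is invented for illustration; real predictive work would typically use a statistics library in R or Python.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

# Made-up data: predicting customer spend from number of site visits.
visits = [1, 2, 3, 4, 5]
spend = [12.0, 19.5, 31.0, 42.5, 50.0]

slope, intercept = fit_line(visits, spend)
prediction = slope * 6 + intercept  # forecast spend at 6 visits
```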
Coding and Typologies
• This is the next major stage of qualitative data analysis. It is here that you
carefully read your transcribed data, line by line, and divide the data into
meaningful analytical units (i.e., segmenting the data). When you locate
meaningful segments, you code them.
• Coding is defined as marking the segments of data with symbols, descriptive
words, or category names. Whenever you find a meaningful segment of text in a
transcript, you assign a code or category name to signify that particular segment.
You continue this process until you have segmented all of your data and have
completed the initial coding. During coding, you must keep a master list (i.e., a
list of all the codes that are developed and used in the research study). The codes
are then reapplied to new segments of data each time an appropriate segment is
encountered.
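The coding workflow above, including the master list, can be sketched with a simple dictionary. The segment identifiers and code names here are invented for illustration.

```python
# Master list of codes: code name -> list of segment ids it has been
# applied to. New codes are registered automatically on first use.
master_list = {}

def apply_code(segment_id, code):
    """Assign a code to a transcript segment, keeping the master list current."""
    master_list.setdefault(code, []).append(segment_id)

apply_code("seg-01", "cost_concerns")
apply_code("seg-02", "ease_of_use")
apply_code("seg-07", "cost_concerns")  # an existing code reapplied to a new segment
print(sorted(master_list))  # ['cost_concerns', 'ease_of_use']
```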
• Typology is a composite measure that involves the classification of observations in terms of their
attributes on multiple variables.[1] Such classification is usually done on a nominal scale.[1]
Typologies are used in both qualitative and quantitative research.

• An example of a typology would be classification such as by age and health: young-healthy, young-
sick, old-healthy, old-sick.
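The age-by-health example can be written as a small classification function over the two variables. The age threshold of 40 is an arbitrary illustrative cutoff, not part of the example above.

```python
# Nominal classification over two variables: age group x health status.
def typology(age, healthy):
    age_class = "young" if age < 40 else "old"
    health_class = "healthy" if healthy else "sick"
    return f"{age_class}-{health_class}"

print(typology(25, True))   # young-healthy
print(typology(63, False))  # old-sick
```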

• Typological theorizing is the development of theories about configurations of variables that
constitute theoretical types.[2] According to Andrew Bennett and Alexander George, typological
theories are useful "to address complex phenomena without oversimplifying, clarify similarities and
differences among cases to facilitate comparisons, provide a comprehensive inventory of all possible
kinds of cases, incorporate interaction effects, and draw attention to... kinds of cases that have
not occurred."
