Chapter 12
Examining Relationships in Quantitative Research
McGraw-Hill/Irwin
Copyright © 2013 by The McGraw-Hill Companies, Inc. All rights reserved.
Learning Objectives
• Understand and evaluate the types of relationships between variables
• Explain the concepts of association and covariation
• Discuss the differences between Pearson correlation and Spearman correlation
• Explain the concept of statistical significance versus practical significance
• Understand when and how to use regression analysis
Examining Relationships between Variables
• Relationships between variables can be described through:
– Presence
– Direction
– Strength of association: no relationship, weak relationship, moderate relationship, or strong relationship
– Type
• Linear relationship: An association between two variables whereby the strength and nature of the relationship remains the same over the range of both variables
• Curvilinear relationship: A relationship between two variables whereby the strength and/or direction of their relationship changes over the range of both variables
Covariation and Variable Relationships
• Covariation: The amount of change in one variable that is consistently related to the change in another variable of interest
– Scatter diagram: A graphic plot of the relative position of two variables, using a horizontal and a vertical axis to represent the values of the respective variables; a way of visually describing the covariation between two variables (see the sketch below)
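As a visual check of covariation, a scatter diagram takes only a few lines of Python. This is a minimal sketch using simulated data; the variable names ad_spend and sales are invented for illustration and are not from the chapter's dataset:

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(42)
    ad_spend = rng.uniform(10, 100, size=50)            # hypothetical X variable
    sales = 5 + 0.8 * ad_spend + rng.normal(0, 8, 50)   # Y covaries with X, plus noise

    plt.scatter(ad_spend, sales)   # each point plots one X value against one Y value
    plt.xlabel("Ad spend (X)")
    plt.ylabel("Sales (Y)")
    plt.title("Scatter diagram showing positive covariation")
    plt.show()

An upward-sloping cloud of points, as here, suggests a positive relationship; a shapeless cloud suggests no relationship.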
Exhibit 12.1 - No Relationship between X and Y
Exhibit 12.2 - Positive Relationship between X and Y
Exhibit 12.3 - Negative Relationship between X and Y
Exhibit 12.4 - Curvilinear Relationship between X and Y
Correlation Analysis
• Pearson correlation coefficient: Statistical measure of the strength of a linear relationship between two metric variables
– Varies between –1.00 and +1.00
• 0 represents absolutely no association between the two variables
• –1.00 or +1.00 represents a perfect linear relationship between the two variables
• The correlation coefficient can be either positive or negative
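As a minimal sketch, the Pearson coefficient and its p-value can be computed with SciPy (the slides use SPSS); the data below are simulated for illustration, not drawn from the Santa Fe Grill survey:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    x = rng.normal(50, 10, 100)            # metric (interval/ratio scaled) variable
    y = 0.6 * x + rng.normal(0, 8, 100)    # variable linearly related to x

    r, p_value = stats.pearsonr(x, y)      # r falls between -1.00 and +1.00
    print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")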
Exhibit 12.5 - Rules of Thumb about the Strength of Correlation Coefficients
Assumptions for Calculating Pearson’s Correlation Coefficient
• The two variables have been measured using interval- or ratio-scaled measures
• The nature of the relationship to be measured is linear
– A straight line describes the relationship between the variables of interest
• The variables to be analyzed need to be from a normally distributed population
Exhibit 12.6 - SPSS Pearson Correlation Example for Santa Fe Grill Customers
Substantive Significance of the Correlation Coefficient
• Coefficient of determination (r²): A number measuring the proportion of variation in one variable accounted for by another
– Can be thought of as a percentage and varies from 0.0 to 1.00
– The larger the coefficient of determination, the stronger the linear relationship between the two variables being examined
– The coefficient of determination is the square of the Pearson correlation; for example, r = 0.60 gives r² = 0.36, meaning 36 percent of the variation in one variable is accounted for by the other
Influence of Measurement Scales on Correlation Analysis
• Spearman rank order correlation coefficient: A statistical measure of the linear association between two variables where both have been measured using ordinal (rank order) scales
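A minimal sketch of the Spearman coefficient in Python, assuming two small sets of ordinal ranks invented for illustration (e.g., two rankings of six restaurant selection factors):

    import numpy as np
    from scipy import stats

    rank_a = np.array([1, 2, 3, 4, 5, 6])   # ordinal (rank order) measurements
    rank_b = np.array([2, 1, 4, 3, 6, 5])

    rho, p_value = stats.spearmanr(rank_a, rank_b)  # correlation computed on the ranks
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")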
Exhibit 12.7 - SPSS Spearman Rank Order Correlation
Exhibit 12.8 - Median Example for Restaurant Selection Factors
What is Regression Analysis?
• A method for arriving at more detailed answers (predictions) than can be provided by the correlation coefficient
• There are a number of ways to make such predictions:
– Extrapolation from past behavior of the variable
– Simple guesses
– Use of a regression equation that includes information about related variables to assist in the prediction
• Bivariate regression analysis: A statistical technique that analyzes the linear relationship between two variables by estimating coefficients for the equation of a straight line
– One variable is designated as the dependent variable
– The other is called the independent or predictor variable
• Use of a simple regression model assumes:
– Variables of interest are measured on interval or ratio scales
– Variables come from a normal population
– Error terms associated with making predictions are normally and independently distributed
Fundamentals of Regression Analysis
• General formula for a straight line: Y = a + bX + ei
• Where:
– Y = the dependent variable
– a = the intercept (the point where the straight line intersects the Y-axis when X = 0)
– b = the slope (the change in Y for every 1-unit change in X)
– X = the independent variable used to predict Y
– ei = the error of the prediction
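A minimal sketch of estimating the intercept a and slope b with scipy.stats.linregress, using simulated data rather than the textbook's dataset, and then using the fitted line to make a prediction:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    x = rng.uniform(1, 7, 80)                     # independent (predictor) variable
    y = 2.0 + 0.75 * x + rng.normal(0, 0.5, 80)   # dependent variable

    result = stats.linregress(x, y)
    print(f"intercept a = {result.intercept:.2f}")  # Y where the line crosses X = 0
    print(f"slope b     = {result.slope:.2f}")      # change in Y per 1-unit change in X

    y_hat = result.intercept + result.slope * 4.0   # predicted Y for X = 4
    print(f"predicted Y at X = 4: {y_hat:.2f}")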
Exhibit 12.9 - The Straight Line Relationship in Regression
Least Squares Procedure
• A regression approach that determines the best-fitting line by minimizing the vertical distances of all the points from the line
Unexplained Variance
• The amount of variation in the dependent variable that cannot be accounted for by the combination of independent variables
Exhibit 12.10 - Fitting the Regression Line Using the “Least Squares” Procedure
Ordinary Least Squares
• A statistical procedure that estimates regression equation coefficients that produce the lowest sum of squared differences between the actual and predicted values of the dependent variable (see the sketch below)
Regression Coefficient
• An indicator of the importance of an independent variable in predicting a dependent variable
• Large coefficients are good predictors; small coefficients are weak predictors
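The least squares coefficients for the bivariate case have a closed form, which makes the "lowest sum of squared differences" idea concrete. A sketch with simulated data:

    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.uniform(0, 10, 60)
    y = 3.0 + 1.2 * x + rng.normal(0, 1.0, 60)

    # Slope and intercept that minimize the sum of squared vertical
    # distances between actual and predicted values of y
    b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    a = y.mean() - b * x.mean()

    residuals = y - (a + b * x)       # unexplained portion of each observation
    sse = np.sum(residuals ** 2)      # the quantity ordinary least squares minimizes
    print(f"a = {a:.2f}, b = {b:.2f}, SSE = {sse:.2f}")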
Exhibit 12.11 - SPSS Results for Bivariate Regression
Significance of Regression Coefficients
• Is there a relationship between the dependent and independent variable?
• How strong is the relationship?
Multiple Regression Analysis
• A statistical technique that analyzes the linear relationship between a dependent variable and multiple independent variables by estimating coefficients for the equation of a straight line
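A minimal sketch of multiple regression with NumPy's least squares solver, assuming two simulated predictors (the ratings are invented, not the QualKote or Santa Fe Grill data):

    import numpy as np

    rng = np.random.default_rng(3)
    n = 100
    x1 = rng.normal(4, 1, n)    # hypothetical predictor, e.g., a food quality rating
    x2 = rng.normal(5, 1, n)    # hypothetical predictor, e.g., a service rating
    y = 1.0 + 0.6 * x1 + 0.3 * x2 + rng.normal(0, 0.5, n)

    X = np.column_stack([np.ones(n), x1, x2])   # column of 1s estimates the intercept
    coefs, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    intercept, b1, b2 = coefs
    print(f"intercept = {intercept:.2f}, b1 = {b1:.2f}, b2 = {b2:.2f}")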
Beta Coefficient
• An estimated regression coefficient that has been recalculated to have a mean of 0 and a standard deviation of 1
– Such a change enables independent variables with different units of measurement to be directly compared on their association with the dependent variable
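Beta coefficients can be obtained by standardizing every variable before fitting. A sketch reusing the simulated data from the previous example:

    import numpy as np

    rng = np.random.default_rng(3)
    n = 100
    x1 = rng.normal(4, 1, n)
    x2 = rng.normal(5, 1, n)
    y = 1.0 + 0.6 * x1 + 0.3 * x2 + rng.normal(0, 0.5, n)

    def z(v):
        return (v - v.mean()) / v.std(ddof=1)   # rescale to mean 0, std deviation 1

    Xz = np.column_stack([z(x1), z(x2)])        # standardized variables need no intercept
    betas, _, _, _ = np.linalg.lstsq(Xz, z(y), rcond=None)
    print(f"beta1 = {betas[0]:.2f}, beta2 = {betas[1]:.2f}")  # directly comparable sizes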
Examining the Statistical Significance of Each Coefficient
• Each regression coefficient is divided by its standard error to produce a t statistic
– The t statistic is compared against the critical value to determine whether the null hypothesis can be rejected
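A sketch of this test for the bivariate case, using the standard error reported by linregress; the alpha = .05 two-tailed critical value is an assumption chosen for illustration:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    n = 50
    x = rng.uniform(0, 10, n)
    y = 2.0 + 0.5 * x + rng.normal(0, 1.0, n)

    res = stats.linregress(x, y)
    t_stat = res.slope / res.stderr          # coefficient divided by its standard error
    t_crit = stats.t.ppf(0.975, n - 2)       # two-tailed critical value, alpha = .05
    print(f"t = {t_stat:.2f}, critical value = {t_crit:.2f}")
    print("reject H0: b = 0" if abs(t_stat) > t_crit else "fail to reject H0")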
• Model F statistic: Compares the amount of variation in the dependent measure “explained” or associated with the independent variables to the “unexplained” or error variance
– A larger F statistic indicates that the regression model has more explained variance than error variance
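A sketch of the F statistic computed from its explained and unexplained variance components, with simulated data; it also reports the multiple r² discussed next:

    import numpy as np

    rng = np.random.default_rng(5)
    n, k = 100, 2                      # n observations, k independent variables
    x1 = rng.normal(0, 1, n)
    x2 = rng.normal(0, 1, n)
    y = 0.8 * x1 + 0.4 * x2 + rng.normal(0, 1.0, n)

    X = np.column_stack([np.ones(n), x1, x2])
    coefs, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    y_hat = X @ coefs
    ss_explained = np.sum((y_hat - y.mean()) ** 2)   # variation tied to the predictors
    ss_error = np.sum((y - y_hat) ** 2)              # unexplained (error) variation

    F = (ss_explained / k) / (ss_error / (n - k - 1))
    r_squared = ss_explained / (ss_explained + ss_error)
    print(f"F = {F:.2f}, multiple r2 = {r_squared:.2f}")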
Substantive Significance
• The multiple r² describes the strength of the relationship between all the independent variables and the dependent variable
– The larger the r², the more of the behavior of the dependent measure is associated with the independent measures used to predict it
Multiple Regression Assumptions
• Linear relationship
• Homoskedasticity: The pattern of the covariation around the regression line is constant (the same), whether the values are small, medium, or large
– Heteroskedasticity: The pattern of covariation around the regression line is not constant and varies in some way as the values change from small to medium to large (see the residual-plot sketch below)
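A common visual check, not shown in the slides, is to plot residuals against a predictor: a band of constant width suggests homoskedasticity, while a fan shape suggests heteroskedasticity. A sketch with simulated data whose error spread deliberately grows with X:

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(6)
    x = rng.uniform(1, 10, 200)
    y = 2.0 + 0.5 * x + rng.normal(0, 0.3 * x)   # error spread grows with x

    # Fit a bivariate least squares line and compute residuals
    b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    a = y.mean() - b * x.mean()
    residuals = y - (a + b * x)

    plt.scatter(x, residuals)        # widening fan shape signals heteroskedasticity
    plt.axhline(0, color="black")
    plt.xlabel("X")
    plt.ylabel("Residual")
    plt.show()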
Multiple Regression Assumptions
• Normal distribution
– Normal curve: A curve indicating that the distribution of a variable is symmetric above and below the mean
Exhibit 12.12 - Example of Heteroskedasticity
Exhibit 12.13 - Example of a Normal Curve
Exhibit 12.14 - SPSS Results for Multiple Regression
Multicollinearity
• A situation in which several independent variables are highly correlated with each other
• Can make it difficult to estimate independent regression coefficients for the correlated variables (see the sketch below)
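One standard diagnostic, not covered in the slides, is the variance inflation factor (VIF); for the two-predictor case it reduces to 1/(1 - r²). A sketch with deliberately correlated simulated predictors:

    import numpy as np

    rng = np.random.default_rng(7)
    n = 100
    x1 = rng.normal(0, 1, n)
    x2 = x1 + rng.normal(0, 0.3, n)    # built to be highly correlated with x1

    r = np.corrcoef(x1, x2)[0, 1]      # correlation between the two predictors
    vif = 1 / (1 - r ** 2)             # variance inflation factor, two-predictor case
    print(f"r = {r:.2f}, VIF = {vif:.1f}")   # VIF above about 10 is a common warning sign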
Marketing Research in Action: The Role of Employees in Developing a Customer Satisfaction Program
• Will the results of this regression model be useful to the QualKote plant manager? If yes, how?
• Which independent variables are helpful in predicting A36–Customer Satisfaction?
• How would the manager interpret the mean values for the variables reported in Exhibit 12.16?
• What other regression models might be examined with the questions from this survey?