Research Design and Statistics



PSM403: Research Design and Statistics




Section 1

Multicollinearity

Multicollinearity describes a situation in which correlation exists between the independent variables. It is a statistical problem that arises when predictor variables are highly linearly related, which destabilises the intercept and slope estimates of the regression. When multicollinearity is present, the coefficients may change erratically in response to small changes in the independent variables. It is an undesirable situation because correlation between the predictors alters their apparent relationships with the dependent variable. Perfect multicollinearity exists when the correlation between two independent variables equals 1 or -1 (Balnaves, 2007). In practice, perfect multicollinearity occurs very rarely; the problem mainly arises from strong, but imperfect, linear relationships between two or more independent variables.
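To make this concrete, the following minimal sketch (using hypothetical, simulated data with Python's numpy and statsmodels packages) fits a regression with and without a nearly collinear second predictor; the standard error of the coefficient for x1 inflates noticeably once x2 is added.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.98 * x1 + 0.05 * rng.normal(size=n)   # x2 is almost a linear copy of x1
y = 2.0 + 1.5 * x1 + rng.normal(size=n)      # only x1 truly drives y

X_single = sm.add_constant(x1)                          # model with x1 alone
X_both = sm.add_constant(np.column_stack([x1, x2]))     # model with both predictors

print(sm.OLS(y, X_single).fit().bse)   # small standard error for x1
print(sm.OLS(y, X_both).fit().bse)     # inflated standard errors for x1 and x2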

Multicollinearity inflates the standard errors of the coefficients. Increased standard errors in turn mean that the coefficients for some independent variables may be found not to differ significantly from 0, rendering statistically insignificant variables that might otherwise have important implications for the research (Jackson, 2008). To assess multicollinearity in the data, the Variance Inflation Factor (VIF) and Tolerance are used. A high Tolerance value (> 0.2) and a VIF close to 1 indicate that collinearity is minimal or absent (Hardle, 2007).
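Below is a minimal sketch of the VIF/Tolerance check described above, assuming pandas and statsmodels; the predictor names and simulated data are illustrative, not the study's actual variables.

import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.tools.tools import add_constant

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = 0.98 * x1 + 0.05 * rng.normal(size=200)   # nearly collinear with x1

X = add_constant(pd.DataFrame({"x1": x1, "x2": x2}))
for i, name in enumerate(X.columns):
    if name == "const":
        continue                                # skip the constant term
    vif = variance_inflation_factor(X.values, i)
    tolerance = 1.0 / vif                       # Tolerance is the reciprocal of VIF
    print(f"{name}: VIF = {vif:.2f}, Tolerance = {tolerance:.3f}")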

Section 2

2.1) Identify the design of this study.

A GLM Univariate Analysis has been applied in this case. The General Linear Model (GLM) encompasses regression analysis and ANOVA for a single dependent variable, and is used to determine the impact of one or more factors. When the population is divided into groups, the GLM Univariate procedure tests null hypotheses about the effects of other variables on the means of various groupings of a single dependent variable (Corbin, 2008).

You can investigate interactions between factors as well as the effects of individual factors, some of which may be random. In addition, the effects of covariates and of covariate-by-factor interactions can be included. For regression analysis, the independent (predictor) variables are specified as covariates.
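As an illustration, the following sketch runs a GLM Univariate analysis in Python with statsmodels; the dependent variable (score), the two factors (group, gender), and the covariate (age) are hypothetical, simulated placeholders rather than the study's actual variables.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "score": rng.normal(50, 10, size=120),            # dependent variable
    "group": np.tile(["control", "treatment"], 60),   # fixed factor
    "gender": np.repeat(["female", "male"], 60),      # fixed factor
    "age": rng.integers(18, 60, size=120),            # covariate
})

# Factors, their interaction, and a covariate, as in the GLM Univariate procedure.
model = smf.ols("score ~ C(group) * C(gender) + age", data=df).fit()
print(anova_lm(model, typ=2))   # ANOVA table with Type II sums of squares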

2.2) Interpret the output.

GLM models test interactions between factors as well as the effects of individual factors. Mauchly's test of sphericity examines sphericity across the different levels of the repeated-measures factor included in the data. Sphericity refers to equality of the variances of the differences between all pairs of repeated-measures levels.

Mauchly's test of sphericity examines whether the variances of the differences between the repeated measurements are equal across time points. If the test is significant, the sphericity assumption has been violated and the analysis requires adjustment through one of several corrections, which include Greenhouse-Geisser, Huynh-Feldt, and the lower bound (Creswell, 2009). If Mauchly's test is not significant, it can be concluded that the variances are equal and the sphericity assumption holds.
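A minimal sketch of this workflow, assuming the third-party pingouin package (exact output columns may vary by version) and a long-format data set with one score per subject per time point; all variable names and values are hypothetical.

import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "subject": np.repeat(np.arange(20), 3),     # 20 subjects measured 3 times
    "time": np.tile(["t1", "t2", "t3"], 20),    # repeated-measures factor
    "score": rng.normal(10, 2, size=60),        # dependent variable
})

# Mauchly's test: a significant p-value means the sphericity assumption is violated.
print(pg.sphericity(df, dv="score", within="time", subject="subject"))

# Repeated-measures ANOVA; with correction=True pingouin also reports the
# Greenhouse-Geisser corrected p-value and epsilon alongside the uncorrected test.
print(pg.rm_anova(df, dv="score", within="time", subject="subject",
                  correction=True, detailed=True))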

Test of within-subjects effects: this table displays univariate tests under each of the three possible epsilon adjustments described above. If Mauchly's test does not indicate a violation of the sphericity assumption, the table is interpreted using the "Sphericity Assumed" rows. The partial eta squared statistic reports the practical significance of each term, based upon the ratio of the variation (sum of squares) accounted for by the term to the sum of the variation accounted for by the term and its associated error.
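For illustration, partial eta squared can be computed directly from the sums of squares in the within-subjects table; the values below are hypothetical placeholders.

ss_effect = 42.0    # sum of squares for the term of interest (hypothetical)
ss_error = 108.0    # sum of squares for that term's associated error (hypothetical)
partial_eta_sq = ss_effect / (ss_effect + ss_error)
print(f"partial eta squared = {partial_eta_sq:.3f}")   # 0.280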