Affiliation:
1. Texas Woman's University
2. University of Notre Dame
Abstract
Howard and his colleagues have identified an instrumentation-related contamination that confounds the results of studies employing self-report measures in a pre/post or posttest-only design. This confounding influence is referred to as response-shift bias. Research has demonstrated that the traditional methods of analysis (i.e., analysis of posttests only, analysis of pre/post difference scores, and analysis of covariance (ANCOVA) with the pretest as covariate) do not account for response-shift bias and produce biased estimates of the treatment effect. Howard and his colleagues recommend a retrospective pre/post design to control for response-shift bias; the only method of analysis that yields an unbiased estimate of the treatment effect is analysis of posttest minus retrospective-pretest difference scores. The purpose of the present study is to determine the relative loss in statistical power of the traditional methods of analysis when response-shift bias is present. Analytic and Monte Carlo techniques were employed to compare the power of five methods of analysis under various conditions. The results indicate that when a response shift is present, the most powerful method of analysis overall is the retrospective pre/post method, and the loss in statistical power of the traditional methods can be substantial under many conditions. Recommendations and applications to applied research are discussed.
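The mechanism the abstract describes can be illustrated with a minimal Monte Carlo sketch. The simulation below is not the study's actual design; it is a simplified illustration assuming a single response-shift constant that recalibrates all post-treatment self-reports (both the posttest and the retrospective pretest), so that the naive pre/post difference absorbs the shift while the posttest-minus-retrospective-pretest difference does not. All variable names and parameter values are hypothetical.

```python
import random
import statistics

random.seed(42)
n = 200
true_effect = 0.5       # assumed true treatment effect
response_shift = -0.4   # assumed recalibration of the subject's internal rating standard

# Each subject's true pre-treatment standing on the latent trait.
true_pre = [random.gauss(0, 1) for _ in range(n)]

# Traditional pretest: self-report taken before treatment, on the OLD internal standard.
trad_pre = [t + random.gauss(0, 0.3) for t in true_pre]

# Posttest: true standing plus treatment effect, reported on the NEW (shifted) standard.
post = [t + true_effect + response_shift + random.gauss(0, 0.3) for t in true_pre]

# Retrospective pretest: pre-treatment standing recalled on the SAME shifted standard,
# so the shift is common to both post-treatment reports and cancels in their difference.
retro_pre = [t + response_shift + random.gauss(0, 0.3) for t in true_pre]

naive_gain = statistics.mean(p - q for p, q in zip(post, trad_pre))
retro_gain = statistics.mean(p - q for p, q in zip(post, retro_pre))

print(f"true treatment effect:              {true_effect:+.2f}")
print(f"pre/post difference estimate:       {naive_gain:+.2f}")   # biased by the shift
print(f"retrospective difference estimate:  {retro_gain:+.2f}")   # shift cancels out
```

Under these assumptions the naive difference estimates `true_effect + response_shift` rather than the effect itself, which is the bias the traditional methods inherit; the retrospective difference recovers the effect because both of its components share the same recalibrated standard.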
Subject
Applied Mathematics, Applied Psychology, Developmental and Educational Psychology, Education
Cited by
54 articles.