Article ID: | iaor1993484 |
Country: | United States |
Volume: | 11 |
Issue: | 1 |
Start Page Number: | 76 |
End Page Number: | 94 |
Publication Date: | Dec 1992 |
Journal: | Marketing Science |
Authors: | Farris Paul W., Parry Mark E., Ailawadi Kusum L. |
Keywords: | information theory |
Composite variables are those that may be mathematically decomposed into additive and/or multiplicative component variables. Several researchers have noted that the relationship between a composite variable and its components may be a mathematical artifact, but the effect of including those components as independent variables on the coefficients of the remaining variables in the model has not been recognized, nor has a formal expression for the resulting bias been presented. Structural analysis of composite dependent variables, as presented here, provides the key to understanding the nature and extent of this bias. It separates higher- and lower-level components from one another, and also separates these components from other antecedent variables in the model. The advantages of this hierarchical decomposition are that (1) it reduces problems of misspecification and omitted variables, (2) by separately estimating antecedent effects on each component, it offers insights into the underlying causal mechanisms that are not available from other techniques, and (3) it ensures that regression coefficients can be interpreted in the standard way: the expected change in the dependent variable associated with a change in the independent variable, holding the other independent variables in the equation constant. Moreover, hierarchical decomposition of the dependent variable can reproduce all information available from techniques that mix levels of analysis, but the converse is not true.
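The core idea can be sketched with a small simulation. This is a hypothetical illustration, not the authors' model: a composite dependent variable (say, sales = price × units) is decomposed multiplicatively, so log(sales) = log(price) + log(units). Regressing each log-component separately on an antecedent (here a made-up "advertising" variable) gives component-level effects that sum exactly to the composite-level effect, which is what lets the hierarchical decomposition attribute the composite effect to its components.

```python
# Hypothetical sketch (not from the article): hierarchical decomposition of a
# multiplicative composite variable. All variable names (adv, price, units)
# are illustrative assumptions.
import random

random.seed(0)

def ols_slope(x, y):
    """OLS slope of y on x in a simple regression with intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

n = 5000
adv = [random.gauss(0.0, 1.0) for _ in range(n)]            # antecedent
log_price = [0.1 * a + random.gauss(0, 0.2) for a in adv]   # component 1
log_units = [0.5 * a + random.gauss(0, 0.2) for a in adv]   # component 2
log_sales = [p + u for p, u in zip(log_price, log_units)]   # composite

b_total = ols_slope(adv, log_sales)   # composite-level effect of adv
b_price = ols_slope(adv, log_price)   # component-level effect on price
b_units = ols_slope(adv, log_units)   # component-level effect on units

# Because log_sales is the exact sum of its components and the OLS slope is
# linear in the dependent variable, the composite effect decomposes exactly:
assert abs(b_total - (b_price + b_units)) < 1e-9
```

Estimating the component regressions separately, rather than putting a component on the right-hand side of the composite regression, is what avoids the artifactual coefficient bias the abstract describes: each slope retains the standard "holding other regressors constant" interpretation.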