Factor analysis is a statistical technique used to describe variability among observed variables in terms of a smaller number of unobserved variables called factors. Factor analysis looks for joint variations in the observed variables that respond to these unobserved latent variables. The observed variables are modeled as linear combinations of the potential factors plus error terms.
Information obtained about the interdependencies between observed variables can later be used to reduce the set of variables in a dataset. Factor analysis originated in psychometrics and is applied in the behavioral sciences, operations research, and other applied sciences that deal with large quantities of data. In psychology, factor analysis is most often associated with intelligence research, but it has also been used to search for factors in a broad range of domains such as personality, beliefs, and attitudes.
Factor analysis isolates the underlying variables that explain the data. There are two types of factor analysis: principal factor analysis and common factor analysis. The factors generated by principal factor analysis are conceptualized as linear combinations of the observed variables, whereas those generated by common factor analysis are conceptualized as latent variables. Computationally, the main difference is that in common factor analysis the diagonal of the correlation matrix is replaced with estimates of the communalities, the variance each variable shares with the others.
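This computational difference can be sketched with NumPy. The correlation values below are hypothetical, and squared multiple correlations are just one common choice of initial communality estimate:

```python
import numpy as np

# A small observed correlation matrix for three variables
# (hypothetical values, for illustration only).
R = np.array([
    [1.0, 0.6, 0.5],
    [0.6, 1.0, 0.4],
    [0.5, 0.4, 1.0],
])

# Principal factor analysis works on R as-is (diagonal of 1s).
# Common factor analysis replaces the diagonal with communality
# estimates; squared multiple correlations are one standard initial
# choice: SMC_i = 1 - 1 / (R^-1)_ii
smc = 1.0 - 1.0 / np.diag(np.linalg.inv(R))
R_common = R.copy()
np.fill_diagonal(R_common, smc)

print(np.round(smc, 3))
```

The off-diagonal correlations are untouched; only the diagonal changes, which is why the two methods can yield noticeably different factors when communalities are far below one.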
Factor analysis is performed by examining the pattern of correlations between the observed variables. Variables that are highly correlated are likely to be influenced by the same factors, while those that are relatively uncorrelated are likely to be influenced by different factors.
Principal component analysis is the most widely used form of factor analysis. It seeks a linear combination of the measures such that the maximum variance is extracted from them. It then removes this variance and seeks a second linear combination that explains the maximum proportion of the remaining variance, and so on.
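This sequential extraction can be illustrated with NumPy on hypothetical data. Each component is an eigenvector of the correlation matrix, and "removing" the extracted variance (deflation) before re-solving recovers the same second component:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: 200 observations of 4 correlated measures.
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 4)) \
    + 0.3 * rng.normal(size=(200, 4))
R = np.corrcoef(X, rowvar=False)

# The first principal component is the linear combination
# (eigenvector) that extracts maximum variance (largest eigenvalue).
vals, vecs = np.linalg.eigh(R)
order = np.argsort(vals)[::-1]           # sort descending by variance
vals, vecs = vals[order], vecs[:, order]
first, second = vecs[:, 0], vecs[:, 1]

# Deflate R by the variance the first component accounts for,
# then re-solve: the top component of the deflated matrix is the
# second component of the original, illustrating the sequential idea.
R_deflated = R - vals[0] * np.outer(first, first)
vals2, vecs2 = np.linalg.eigh(R_deflated)
second_again = vecs2[:, np.argmax(vals2)]

print(np.round(vals, 3))  # variance extracted by each component
```

In practice the eigendecomposition yields all components at once; the deflation loop is shown only to mirror the "extract, remove, repeat" description above.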
Conducting a Confirmatory Factor Analysis
The main purpose of a Confirmatory Factor Analysis is to establish the ability of a predefined factor model to fit an observed set of data. Common uses of Confirmatory Factor Analysis include: establishing the validity of a single-factor model, comparing the ability of two differing models to account for the same set of data, testing the significance of a specific factor loading, testing the relationship between two or more factor loadings, and evaluating the convergent and discriminant validity of a set of measures.
The six stages involved are as follows:
Define the factor model. The first requirement is to accurately define the model one wants to test. This involves choosing the number of factors and defining the nature of the loadings between measures and factors. Loadings can be fixed at zero or any other constant, or allowed to vary freely within specified constraints.
Collect the measurements by measuring the variables on the same experimental units.
Obtain a correlation matrix by computing the correlation between each pair of variables.
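These two steps might look like the following sketch in NumPy, using hypothetical measurements generated from a single latent variable (the loadings and sample size are made up for illustration):

```python
import numpy as np

# Hypothetical measurements: 100 experimental units on 3 variables,
# each driven by one latent variable plus noise.
rng = np.random.default_rng(42)
latent = rng.normal(size=(100, 1))
data = latent @ np.array([[0.8, 0.7, 0.6]]) \
    + 0.5 * rng.normal(size=(100, 3))

# Correlation between each pair of variables.
R = np.corrcoef(data, rowvar=False)
print(np.round(R, 2))
```

Because all three variables load positively on the same latent variable, every off-diagonal correlation comes out positive, which is exactly the pattern a one-factor model is meant to explain.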
Fit the model to the data by selecting a method to obtain estimates of the factor loadings that were free to vary. The standard model-fitting procedure is maximum likelihood estimation, which should be used unless the measures seriously lack multivariate normality; in that case, asymptotically distribution-free estimation can be used instead.
Evaluate model adequacy. When the factor model is fit to the data, the factor loadings are chosen to minimize the discrepancy between the correlation matrix implied by the model and the actual observed matrix. The amount of discrepancy remaining after the best parameters have been selected can be used as a measure of how consistent the model is with the data.
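As a sketch of this discrepancy measure, the maximum likelihood fit function can be computed for a hypothetical one-factor model. The loadings below are chosen so they happen to reproduce the observed correlations exactly, so the discrepancy is (numerically) zero:

```python
import numpy as np

# Observed correlation matrix S (hypothetical) and a one-factor
# model with loadings L; setting the implied diagonal to 1 lets
# the uniquenesses absorb the remaining variance.
S = np.array([
    [1.00, 0.56, 0.48],
    [0.56, 1.00, 0.42],
    [0.48, 0.42, 1.00],
])
L = np.array([0.8, 0.7, 0.6])   # candidate factor loadings
Sigma = np.outer(L, L)          # implied common variance
np.fill_diagonal(Sigma, 1.0)    # implied correlation matrix

# Maximum likelihood discrepancy between implied and observed:
# F = ln|Sigma| - ln|S| + tr(S Sigma^-1) - p
p = S.shape[0]
F = (np.log(np.linalg.det(Sigma)) - np.log(np.linalg.det(S))
     + np.trace(S @ np.linalg.inv(Sigma)) - p)
print(round(F, 6))
```

A fitting program would search over the free loadings to minimize F; here the minimum is reached by construction, since each off-diagonal of S equals the product of the corresponding loadings.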
The most commonly used assessment of model adequacy is the χ² goodness-of-fit test. The null hypothesis for this test is that the model adequately fits the data, while the alternative is that there is a significant amount of discrepancy. Unfortunately, this test is highly sensitive to sample size: with large samples it generally leads to rejection of the null hypothesis even when the factor model is suitable. Other statistics, such as the Tucker-Lewis index, compare the fit of the proposed model to that of a null model; these statistics are much less sensitive to sample size.
Compare with other models. By comparing a full and a reduced model, one can examine the difference between their χ² statistics, which itself approximately follows a χ² distribution. Almost all tests of individual factor loadings can be expressed as comparisons of full and reduced factor models. In situations where a full-versus-reduced comparison is not available, the root mean square error of approximation (RMSEA), an estimate of the discrepancy per degree of freedom in the model, is recommended instead.
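These comparison statistics involve only simple arithmetic once the fit results are in hand. The sketch below uses hypothetical values; the chi-square statistics, degrees of freedom, and sample size are made up for illustration:

```python
import math

# Hypothetical fit results for a full and a reduced factor model.
n = 300                          # sample size
chi2_full, df_full = 8.5, 6
chi2_reduced, df_reduced = 27.9, 9

# The difference between the two chi-square statistics is itself
# approximately chi-square distributed, with degrees of freedom
# equal to the difference in degrees of freedom.
chi2_diff = chi2_reduced - chi2_full
df_diff = df_reduced - df_full

# RMSEA: estimated discrepancy per degree of freedom.
def rmsea(chi2, df, n):
    return math.sqrt(max(chi2 - df, 0) / (df * (n - 1)))

print(chi2_diff, df_diff)
print(round(rmsea(chi2_reduced, df_reduced, n), 4))
```

A large chi-square difference relative to df_diff would favor the full model; RMSEA values below roughly 0.05 to 0.08 are conventionally read as acceptable fit, though cutoffs vary across sources.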