5 December 2008
From my point of view, an applied quantitative social science study usually proceeds in three parts. The first part is theoretical/formal modeling (with either explicit or implicit assumptions), the second is deriving empirical implications from the model, and the last is applying (or, in some cases, inventing) appropriate statistical methods to collect evidence and evaluate the derived empirical implications.
Professor Lieberson and his coauthor, in a recent article that I will point out below, called this entire process implication analysis, whereas I had previously tended to think of implication analysis as only the second part of the process, something like comparative statics and dynamics. But given that some of us, and probably more of us over time, are increasingly interested in producing work that integrates all three parts, it seems natural to give a name to this integrated approach, as distinguished from formal analysis and empirical/statistical analysis alone.
Certainly, the integrated approach increases the complexity of research, as many things can go wrong between theory and data. A symposium on implication analysis, beginning with Stanley Lieberson and Joel Horwich's paper, "Implication Analysis: A Pragmatic Proposal for Linking Theory and Data in the Social Sciences," and followed by five response papers in the latest Sociological Methodology (Volume 38, Issue 1, December 2008), tries to address some of these issues, including the specification of testable hypotheses, assessment of data quality, validation of estimates in different contexts, and dealing with inconsistent evidence.
Just FYI, Washington University's Weidenbaum Center and Department of Political Science will sponsor a new summer institute on Empirical Implications of Theoretical Models in politics in 2009.
Here is the institute's website. http://wc.wustl.edu/eitm.html