Missing Data and Small-Area Estimation: Modern Analytical Equipment for the Survey Statistician (Repost)

Posted By: AvaxGenius

Missing Data and Small-Area Estimation: Modern Analytical Equipment for the Survey Statistician by Nicholas T. Longford
English | PDF | 2005 | 357 pages | ISBN: 1852337605 | 2.45 MB

This book develops methods for two key problems in the analysis of large-scale surveys: dealing with incomplete data and making inferences about sparsely represented subdomains. The presentation is committed to two particular methods, multiple imputation for missing data and multivariate composition for small-area estimation. The methods are presented as developments of established approaches by attending to their deficiencies.
Thus the change to more efficient methods can be gradual, sensitive to the management priorities in large research organisations and multidisciplinary teams, and to other sources of inertia. The typical setting of each problem is addressed first, and the range of applications is then widened to reinforce the view that the general method is essential for modern survey analysis. The general tone of the book is not "from theory to practice," but "from current practice to better practice." The third part of the book, a single chapter, presents a method for efficient estimation under model uncertainty. It is inspired by the solution for small-area estimation and is an example of "from good practice to better theory."
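
For readers unfamiliar with the first of the two methods, the following minimal sketch illustrates only the combining step of multiple imputation (Rubin's rules); the numbers and code are hypothetical and are not taken from the book.

import numpy as np

# Hypothetical completed-data estimates of a population mean obtained from
# M = 5 multiply imputed datasets, with their estimated sampling variances.
estimates = np.array([12.1, 11.8, 12.4, 12.0, 12.3])
variances = np.array([0.40, 0.38, 0.45, 0.41, 0.39])
M = len(estimates)

# Rubin's combining rules: pool the estimates by averaging; the total variance
# is the average within-imputation variance plus the between-imputation
# variance inflated by the factor (1 + 1/M).
pooled = estimates.mean()
within = variances.mean()
between = estimates.var(ddof=1)
total_variance = within + (1.0 + 1.0 / M) * between

print(f"pooled estimate: {pooled:.2f}  standard error: {total_variance ** 0.5:.2f}")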

A strength of the presentation is its chapters of case studies, one for each problem. Wherever possible, examples and illustrations are preferred to theoretical argument. The book is suitable for graduate students and researchers who are acquainted with the fundamentals of sampling theory and have a good grounding in statistical computing, or who can combine reading it with an intensive period of learning and of establishing their own modern computing and graphical environment to serve them in most of their future analytical work.

While some analysts might regard data imperfections and deficiencies, such as nonresponse and limited sample size, as someone else's failure that bars effective and valid analysis, this book presents them as respectable analytical and inferential challenges: opportunities to harness computing power in the service of high-quality, socially relevant statistics.

The overriding principle of this approach is to do the best that can be done for the consumer of statistical information with what is available. The reputation of government statistics as a rigid, procedure-based and operation-centred activity, distant from the mainstream of statistical theory and practice, is refuted most resolutely.

After leaving De Montfort University in 2004, where he was a Senior Research Fellow in Statistics, Nick Longford founded the statistical research and consulting company SNTL in Leicester, England. He was awarded the first Campion Fellowship (2000–02) for methodological research in United Kingdom government statistics. He has served as Associate Editor of the Journal of the Royal Statistical Society, Series A, and of the Journal of Educational and Behavioral Statistics, and as an Editor of the Journal of Multivariate Analysis. He is a member of the Editorial Board of the British Journal of Mathematical and Statistical Psychology. He is the author of two other monographs, Random Coefficient Models (Oxford University Press, 1993) and Models for Uncertainty in Educational Testing (Springer-Verlag, 1995).