Conditional inference

dc.contributor.author: Senyonyi, John M.
dc.date.accessioned: 2017-12-14T16:48:51Z
dc.date.available: 2017-12-14T16:48:51Z
dc.date.issued: 1984-08
dc.description: This thesis was submitted for the degree of Doctor of Philosophy in the Department of Statistics, University of Melbourne. Available at: https://minerva-access.unimelb.edu.au/handle/11343/36921
dc.description.abstract: Conditional inference is a branch of statistical inference in which observed data are reduced using either sufficient or ancillary statistics. This often simplifies inference about the parameters. In comparison to full likelihood methods, the performance of conditional inference theory still needs validating in many areas, some of which are the concern of this thesis.

While the definition of an ancillary statistic in single-parameter models is unequivocal, the presence of accessory (or nuisance) parameters in a model presents problems in defining an ancillary statistic. Statistical literature abounds with definitions of ancillarity in this case. Some of the commonest and most useful of these are discussed and shown to be interrelated. This facilitates the choice of the strongest eligible ancillary in a problem, i.e. the one offering the biggest reduction of the sample space.

The Pitman-Morgan test for variance ratios in bivariate normal populations with unknown correlation coefficient is shown to be a conditional test: we condition on sufficient statistics for the accessory parameters to eliminate them, and the test statistic is then derived as an ancillary statistic for the accessory parameters.

When a probability model depends on a number of accessory parameters which increases with the sample size, estimation methods based on the full likelihood will often be inconsistent, and using a partial likelihood instead has been suggested. Local maximum partial likelihood estimators are shown to exist, and to be consistent and asymptotically normal, under mild conditions. These results also cover conditional and marginal likelihoods, thus considerably strengthening earlier results in this area.

In planning statistical inferences, it is useful to choose a sampling scheme which provides only the data essential to our inferences. Jagers' lemma proposes very general conditions under which maximum likelihood estimation from a subset of the data is identical with that from the full data. However, the lemma is incorrect as given. We show that an additional sufficiency condition repairs the lemma. It is further shown that the lemma cannot be extended to general exponential families.
dc.identifier.citation: Senyonyi-Mubiru, J. M. (1984). Conditional inference. PhD thesis, Dept. of Statistics, The University of Melbourne.
dc.identifier.uri: https://hdl.handle.net/20.500.11951/78
dc.language.iso: en
dc.subject: Conditional inference
dc.subject: Sufficient statistics
dc.subject: Ancillary statistics
dc.title: Conditional inference
dc.type: PhD Thesis
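
The Pitman-Morgan test mentioned in the abstract admits a short illustration. The sketch below is not from the thesis; it is a minimal Python example (NumPy and SciPy assumed) of the standard construction of the test, which rests on the identity cov(X+Y, X-Y) = var(X) - var(Y) for a bivariate normal pair, so that equality of variances can be tested as zero correlation between sums and differences. The function name pitman_morgan_test is ours.

    import numpy as np
    from scipy import stats

    def pitman_morgan_test(x, y):
        """Pitman-Morgan test for equality of variances in paired
        (bivariate normal) samples with unknown correlation.

        Since cov(X+Y, X-Y) = var(X) - var(Y), H0: var(X) = var(Y)
        holds iff sums and differences are uncorrelated, which is
        tested with an ordinary t-test on their correlation.
        """
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        n = len(x)
        r, _ = stats.pearsonr(x + y, x - y)        # correlation of sums and differences
        t = r * np.sqrt((n - 2) / (1.0 - r ** 2))  # t-distributed with n-2 df under H0
        p = 2.0 * stats.t.sf(abs(t), df=n - 2)     # two-sided p-value
        return t, p

    # Example: correlated pair with var(Y) > var(X); the test should reject H0.
    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    y = 0.6 * x + rng.normal(scale=1.5, size=200)
    t, p = pitman_morgan_test(x, y)
    print(f"t = {t:.3f}, p = {p:.4g}")
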
Files
Original bundle
Name: Senyonyi_thesis_1984.pdf
Size: 4.35 MB
Format: Adobe Portable Document Format
Description: Thesis, University of Melbourne
License bundle
Name: license.txt
Size: 1.71 KB
Description: Item-specific license agreed upon to submission