Sampling with unequal selection probabilities




Introduction

This article is, if not explicitly stated otherwise, based on the lecture notes for the teaching module "Forest Inventory" by Kleinn et al. (2007[1]).

Most often, one speaks about random sampling with equal selection probabilities: each element of the population has the same probability of being selected. However, there are situations in which equal selection probabilities are not reasonable: if it is known that some elements carry much more information about the target variable, they should also have a greater chance of being selected. Stratification goes in that direction: within a stratum the selection probabilities are equal, but they may differ between strata.


Sampling with unequal selection probabilities is still random sampling, but it is not simple random sampling; it is "random sampling with unequal selection probabilities". These selection probabilities must, of course, be defined for each and every element of the population before sampling, and no population element may have a selection probability of 0.

Various sampling strategies that are important for forest inventory are based on the principle of unequal selection probabilities; relascope sampling (mentioned in the note below) is one example.



In unequal probability sampling, we distinguish two different probabilities, which are actually two different points of view on the sampling process:

The selection probability is the probability that element i is selected in one particular draw (selection step). The Hansen-Hurwitz estimator for sampling with replacement (that is, when the selection probabilities do not change from draw to draw) is based on this probability. The selection probability is denoted \(P_i\) or \(p_i\).
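For reference, in its standard form (the notation \(y_i\) for the observed value of the i-th drawn element is added here for illustration and is not part of this introduction), the Hansen-Hurwitz estimator of the population total is

\[\hat{\tau}_{HH} = \frac{1}{n}\sum_{i=1}^{n}\frac{y_i}{p_i},\]

that is, each observation is expanded by the inverse of its selection probability and the n expanded values are averaged.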

The inclusion probability is the probability that element i is eventually included in the sample of size n. The Horvitz-Thompson estimator is based on the inclusion probability and is applicable to sampling with or without replacement. The inclusion probability is generally denoted \(\pi_i\).
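Correspondingly, and again only for reference (with \(y_i\) the observed value of a sampled element), the Horvitz-Thompson estimator of the population total can be written as

\[\hat{\tau}_{HT} = \sum_{i=1}^{m}\frac{y_i}{\pi_i},\]

where the sum runs over the m distinct elements that appear in the sample.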


Note:
A typical example of sampling with equal inclusion probabilities is given by fixed-area sample plots in forest inventories. With this concept, and under the assumption that sample points are randomly distributed over an area of interest, each tree has the same probability of becoming part of a sample. In contrast to this constant inclusion probability, it is possible to weight the probability proportionally to a meaningful variable. Imagine, for example, different plot sizes for different tree dimensions: if bigger trees are observed on larger plots and smaller trees on smaller plots, their probability of being included in a sample is no longer constant. This weighting is particularly efficient if the inclusion probability is proportional to the respective target variable (as, for example, in relascope sampling).
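The gain from such weighting can be illustrated with a minimal sketch (not from the lecture notes; the population values, variable names and sample size are made up for illustration): trees are drawn with replacement with selection probability proportional to their basal area, and the total stem volume is estimated with the Hansen-Hurwitz estimator given above.

```python
import random

# Hypothetical toy population: basal area (m^2) is the size variable used for
# the selection probabilities, stem volume (m^3) is the target variable.
basal_area = [0.02, 0.05, 0.09, 0.15, 0.31, 0.42]
volume     = [0.10, 0.32, 0.61, 1.10, 2.48, 3.40]

total_ba = sum(basal_area)
# Selection probability p_i proportional to basal area
# (probability proportional to size, sampling with replacement).
p = [ba / total_ba for ba in basal_area]

n = 4                      # number of draws (with replacement)
random.seed(1)
sample = random.choices(range(len(basal_area)), weights=p, k=n)

# Hansen-Hurwitz estimator of the total volume:
# average of the expanded values y_i / p_i over the n draws.
tau_hat = sum(volume[i] / p[i] for i in sample) / n

print(f"true total volume:       {sum(volume):.2f} m^3")
print(f"Hansen-Hurwitz estimate: {tau_hat:.2f} m^3")
```

Because stem volume is roughly proportional to basal area in this toy population, the expanded values \(y_i/p_i\) are nearly constant, so the estimate varies little between repeated samples.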

References

  1. Kleinn, C. 2007. Lecture Notes for the Teaching Module Forest Inventory. Department of Forest Inventory and Remote Sensing, Faculty of Forest Science and Forest Ecology, Georg-August-Universität Göttingen. 164 p.