I took a course on Statistical Modeling of Extreme Values in university. The following results may have been covered elsewhere, but they were missing from the course I took. I think that's a shame, because they lend themselves more easily to quick estimation than the GEV. When valuing insurance it may of course be important to use the formal tools, but in most cases of extreme value analysis we just want to know whether something has a chance of being Really Bad.
Suppose we pick N samples from a stochastic variable X that are independent and identically distributed. What is the expected value of the Max of these samples? In particular, how does it grow with respect to N?
1. X ~ U(0, 1)
The max of N samples has CDF x^N, so E(Max({X})) = ∫ x · N x^(N-1) dx = N/(N+1). Another way to see this is that the expected value of 1 - Max(X) shrinks at the speed O(1/N). So with N=10, the max will be about 0.9, with N=100, the max will be about 0.99, and so on...
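A quick Monte Carlo sanity check (my own sketch, not part of the original derivation): averaging the max of N uniforms over many trials should land close to N/(N+1).

```python
import random

def mean_max_uniform(n, trials=20000, seed=0):
    """Average the max of n iid U(0,1) draws over many trials."""
    rng = random.Random(seed)
    return sum(max(rng.random() for _ in range(n)) for _ in range(trials)) / trials

# E(Max) = N/(N+1), so 1 - E(Max) shrinks like 1/N.
for n in (10, 100):
    print(n, round(mean_max_uniform(n), 3), n / (n + 1))
```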
2. X ~ Exp(1)
At this point I had to take help from Wolfram Alpha [1]. The expression evaluates to the Digamma function of N, plus a constant [2]. The constant is the Euler–Mascheroni constant, about 0.58. The Digamma function grows as ln(N) - 1/(2N). Calculating some values:
N | E(Max({X})) |
---|---|
10 | 2.8 |
100 | 5.2 |
1000 | 7.5 |
1,000,000 | 14.4 |
Basically, the expected max grows logarithmically. Every extra order of magnitude adds about 2.3 (ln(10)) to the expected max value.
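A closed form worth knowing (my addition, not from the WA derivation above): the expected max of N iid Exp(1) variables is exactly the N-th harmonic number H_N = ψ(N+1) + γ, which for large N is within 1/N of ψ(N) + γ. A sketch comparing the exact sum with the logarithmic approximation:

```python
import math

GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def harmonic(n):
    """H_n = 1 + 1/2 + ... + 1/n: the exact E(Max) of n iid Exp(1) samples."""
    return sum(1.0 / k for k in range(1, n + 1))

# The logarithmic approximation ln(N) + gamma is already good at N = 100.
for n in (10, 100, 1000):
    print(n, round(harmonic(n), 2), round(math.log(n) + GAMMA, 2))
```

The gap between successive rows confirms the rule of thumb: each extra order of magnitude adds about ln(10) ≈ 2.3.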
3. X ~ Pow(a), a > 1
Here the tail is a power law, and the growth switches from logarithmic to polynomial: the typical max grows like N^(1/a). Some values:
N | E(Max({X})), a=2 | E(Max({X})), a=3 |
---|---|---|
10 | 3.16 | 2.15 |
100 | 10 | 4.64 |
1000 | 31.6 | 10 |
1,000,000 | 1000 | 100 |
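The table entries are exactly N^(1/a). Under the reading (my assumption) that Pow(a) is a Pareto law with survival function P(X > x) = x^(-a) for x ≥ 1, this is the characteristic largest value: the level exceeded on average once in N draws. A sketch reproducing the table:

```python
def characteristic_max(a, n):
    """Solve n * P(X > x) = 1 for a Pareto with survival x**(-a): x = n**(1/a)."""
    return n ** (1.0 / a)

for a in (2, 3):
    print(a, [round(characteristic_max(a, n), 2) for n in (10, 100, 1000, 10**6)])
```

Each extra order of magnitude in N now multiplies the typical max by 10^(1/a) instead of adding a constant, which is what makes heavy tails dangerous.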
References
[1] Wolfram Alpha input
[2] Digamma function (wikipedia)
[3] Wolfram Alpha input
[4] Why, you may ask, did I not just use WA on the original expression? Because the computation times out in the free version.