Fits a generalized additive model (GAM) to a continuous risk factor in one of three types of models: the number of reported claims (claim frequency), the severity of reported claims (claim severity), or the burning cost (i.e. risk premium or pure premium).

Usage

fit_gam(
  data,
  nclaims,
  x,
  exposure,
  amount = NULL,
  pure_premium = NULL,
  model = "frequency",
  round_x = NULL
)

Arguments

data

data.frame of an insurance portfolio

nclaims

column in data with number of claims

x

column in data with continuous risk factor

exposure

column in data with exposure

amount

column in data with claim amount

pure_premium

column in data with pure premium

model

one of 'frequency', 'severity' or 'burning' (model = 'frequency' is the default). See the Details section.

round_x

round elements in column x to a multiple of round_x. This speeds up fitting when x contains many distinct levels.
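
The rounding that round_x performs can be illustrated in base R. This is a sketch of the idea, not the package's internal code; the values are made up for illustration:

```r
# Round a continuous risk factor to multiples of 5, mimicking the effect
# of round_x = 5: fewer distinct levels of x means faster fitting
x <- c(18.3, 22.7, 41.1)
round(x / 5) * 5
#> [1] 20 25 40
```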

Value

A list with components

prediction

data frame with predicted values

x

name of continuous risk factor

model

either 'frequency', 'severity' or 'burning'

data

data frame with predicted values and observed values

x_obs

observations for continuous risk factor

Details

The 'frequency' specification uses a Poisson GAM for fitting the number of claims. The logarithm of the exposure is included as an offset, such that the expected number of claims is proportional to the exposure.
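
In mgcv terms, the frequency specification corresponds roughly to the following sketch. The column names match the MTPL example below, but this is an illustration of the model form, not the package's exact internal call:

```r
library(mgcv)

# Poisson GAM for claim counts; log(exposure) enters as an offset so the
# expected number of claims is proportional to the exposure
freq_gam <- gam(nclaims ~ s(age_policyholder) + offset(log(exposure)),
                data = MTPL, family = poisson())
```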

The 'severity' specification uses a lognormal GAM for fitting the average cost of a claim. The average cost of a claim is defined as the ratio of the claim amount and the number of claims. The number of claims is included as a weight.
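
A lognormal GAM of this kind can be sketched in mgcv as a Gaussian fit on the log of the average claim cost, with the number of claims as weights. The data frame `dat` and its columns `amount` and `nclaims` are hypothetical here, standing in for a portfolio with claim amounts:

```r
library(mgcv)

# Severity: average cost per claim, fitted on the log scale with the
# number of claims as weights (rows without claims carry no information)
dat <- subset(portfolio, nclaims > 0)    # 'portfolio' is a placeholder name
dat$avg_cost <- dat$amount / dat$nclaims
sev_gam <- gam(log(avg_cost) ~ s(age_policyholder),
               data = dat, weights = nclaims, family = gaussian())
```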

The 'burning' specification uses a lognormal GAM for fitting the pure premium of a claim. The pure premium is obtained by multiplying the estimated frequency and the estimated severity of claims. The term burning cost is used here as an equivalent of risk premium and pure premium. Note that the functionality for fitting a GAM for pure premium is still experimental (in the early stages of development).
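
The multiplication of estimated frequency and severity can be sketched as follows, assuming fitted frequency and severity GAM objects `freq_gam` and `sev_gam` (hypothetical names) on the same risk factor:

```r
# Pure premium on a grid of risk-factor values: predicted frequency
# (per unit of exposure) times predicted severity
grid <- data.frame(age_policyholder = 18:95, exposure = 1)
freq_hat <- predict(freq_gam, newdata = grid, type = "response")
sev_hat  <- exp(predict(sev_gam, newdata = grid))  # back-transform from log scale
grid$pure_premium <- freq_hat * sev_hat
```

Note that the naive back-transform `exp()` of a lognormal fit ignores the variance correction term; this sketch only shows the structure of the calculation.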

References

Antonio, K. and Valdez, E. A. (2012). Statistical concepts of a priori and a posteriori risk classification in insurance. Advances in Statistical Analysis, 96(2):187–224. doi:10.1007/s10182-011-0152-7.

Grubinger, T., Zeileis, A., and Pfeiffer, K.-P. (2014). evtree: Evolutionary learning of globally optimal classification and regression trees in R. Journal of Statistical Software, 61(1):1–29. doi:10.18637/jss.v061.i01.

Henckaerts, R., Antonio, K., Clijsters, M. and Verbelen, R. (2018). A data driven binning strategy for the construction of insurance tariff classes. Scandinavian Actuarial Journal, 2018(8):681–705. doi:10.1080/03461238.2018.1429300.

Wood, S.N. (2011). Fast stable restricted maximum likelihood and marginal likelihood estimation of semiparametric generalized linear models. Journal of the Royal Statistical Society (B), 73(1):3–36. doi:10.1111/j.1467-9868.2010.00749.x.

Author

Martin Haringa

Examples

fit_gam(MTPL, nclaims = nclaims, x = age_policyholder,
        exposure = exposure)
#>            x  predicted     lwr_95    upr_95
#> 1   18.00000 0.33683705 0.21781040 0.5209081
#> 2   18.77778 0.32218997 0.21698662 0.4783999
#> 3   19.55556 0.30817998 0.21568460 0.4403416
#> 4   20.33333 0.29478165 0.21372588 0.4065779
#> 5   21.11111 0.28197418 0.21093250 0.3769426
#> 6   21.88889 0.26974205 0.20717072 0.3512117
#> 7   22.66667 0.25807532 0.20239831 0.3290683
#> 8   23.44444 0.24696930 0.19669877 0.3100875
#> 9   24.22222 0.23642372 0.19028011 0.2937573
#> 10  25.00000 0.22644165 0.18343007 0.2795388
#> 11  25.77778 0.21702844 0.17644700 0.2669433
#> 12  26.55556 0.20819044 0.16958063 0.2555908
#> 13  27.33333 0.19993360 0.16300468 0.2452288
#> 14  28.11111 0.19226234 0.15681941 0.2357158
#> 15  28.88889 0.18517864 0.15107034 0.2269878
#> 16  29.66667 0.17868117 0.14577063 0.2190219
#> 17  30.44444 0.17276468 0.14091962 0.2118061
#> 18  31.22222 0.16741962 0.13651454 0.2053212
#> 19  32.00000 0.16263204 0.13255476 0.1995340
#> 20  32.77778 0.15838349 0.12904044 0.1943990
#> 21  33.55556 0.15465092 0.12596792 0.1898651
#> 22  34.33333 0.15140695 0.12332514 0.1858831
#> 23  35.11111 0.14862005 0.12108853 0.1824113
#> 24  35.88889 0.14625487 0.11922234 0.1794168
#> 25  36.66667 0.14427229 0.11768048 0.1768730
#> 26  37.44444 0.14262987 0.11641068 0.1747544
#> 27  38.22222 0.14128239 0.11535995 0.1730298
#> 28  39.00000 0.14018239 0.11447950 0.1716561
#> 29  39.77778 0.13928082 0.11372806 0.1705748
#> 30  40.55556 0.13852786 0.11307267 0.1697136
#> 31  41.33333 0.13787411 0.11248705 0.1689907
#> 32  42.11111 0.13727167 0.11194807 0.1683237
#> 33  42.88889 0.13667536 0.11143197 0.1676373
#> 34  43.66667 0.13604407 0.11091147 0.1668717
#> 35  44.44444 0.13534205 0.11035510 0.1659866
#> 36  45.22222 0.13453992 0.10972876 0.1649612
#> 37  46.00000 0.13361546 0.10899937 0.1637908
#> 38  46.77778 0.13255422 0.10813962 0.1624809
#> 39  47.55556 0.13134973 0.10713267 0.1610410
#> 40  48.33333 0.13000323 0.10597508 0.1594794
#> 41  49.11111 0.12852296 0.10467710 0.1578010
#> 42  49.88889 0.12692325 0.10326012 0.1560090
#> 43  50.66667 0.12522346 0.10175212 0.1541090
#> 44  51.44444 0.12344655 0.10018246 0.1521130
#> 45  52.22222 0.12161762 0.09857760 0.1500427
#> 46  53.00000 0.11976256 0.09695867 0.1479297
#> 47  53.77778 0.11790690 0.09534118 0.1458135
#> 48  54.55556 0.11607463 0.09373643 0.1437362
#> 49  55.33333 0.11428738 0.09215413 0.1417365
#> 50  56.11111 0.11256383 0.09060508 0.1398444
#> 51  56.88889 0.11091931 0.08910266 0.1380777
#> 52  57.66667 0.10936563 0.08766282 0.1364414
#> 53  58.44444 0.10791109 0.08630242 0.1349302
#> 54  59.22222 0.10656065 0.08503682 0.1335324
#> 55  60.00000 0.10531631 0.08387718 0.1322353
#> 56  60.77778 0.10417745 0.08282849 0.1310291
#> 57  61.55556 0.10314121 0.08188883 0.1299092
#> 58  62.33333 0.10220313 0.08105032 0.1288765
#> 59  63.11111 0.10135754 0.08030133 0.1279350
#> 60  63.88889 0.10059811 0.07962946 0.1270884
#> 61  64.66667 0.09991824 0.07902461 0.1263360
#> 62  65.44444 0.09931152 0.07848142 0.1256702
#> 63  66.22222 0.09877204 0.07800040 0.1250752
#> 64  67.00000 0.09829468 0.07758724 0.1245288
#> 65  67.77778 0.09787524 0.07725078 0.1240060
#> 66  68.55556 0.09751063 0.07700007 0.1234846
#> 67  69.33333 0.09719881 0.07684142 0.1229494
#> 68  70.11111 0.09693873 0.07677604 0.1223965
#> 69  70.88889 0.09673020 0.07679874 0.1218344
#> 70  71.66667 0.09657374 0.07689774 0.1212843
#> 71  72.44444 0.09647024 0.07705556 0.1207766
#> 72  73.22222 0.09642076 0.07725076 0.1203478
#> 73  74.00000 0.09642621 0.07745955 0.1200370
#> 74  74.77778 0.09648712 0.07765661 0.1198838
#> 75  75.55556 0.09660332 0.07781429 0.1199292
#> 76  76.33333 0.09677375 0.07790047 0.1202195
#> 77  77.11111 0.09699627 0.07787574 0.1208114
#> 78  77.88889 0.09726760 0.07769195 0.1217756
#> 79  78.66667 0.09758320 0.07729414 0.1231980
#> 80  79.44444 0.09793730 0.07662748 0.1251733
#> 81  80.22222 0.09832305 0.07564773 0.1277953
#> 82  81.00000 0.09873263 0.07433108 0.1311448
#> 83  81.77778 0.09915751 0.07267893 0.1352829
#> 84  82.55556 0.09958874 0.07071599 0.1402500
#> 85  83.33333 0.10001728 0.06848345 0.1460711
#> 86  84.11111 0.10043439 0.06603031 0.1527642
#> 87  84.88889 0.10083198 0.06340585 0.1603494
#> 88  85.66667 0.10120300 0.06065455 0.1688587
#> 89  86.44444 0.10154181 0.05781385 0.1783438
#> 90  87.22222 0.10184440 0.05491416 0.1888817
#> 91  88.00000 0.10210858 0.05198048 0.2005784
#> 92  88.77778 0.10233412 0.04903472 0.2135685
#> 93  89.55556 0.10252277 0.04609792 0.2280129
#> 94  90.33333 0.10267806 0.04319190 0.2440917
#> 95  91.11111 0.10280433 0.04033996 0.2619916
#> 96  91.88889 0.10290626 0.03756567 0.2818982
#> 97  92.66667 0.10298930 0.03489104 0.3039977
#> 98  93.44444 0.10305920 0.03233509 0.3284728
#> 99  94.22222 0.10312138 0.02991252 0.3555039
#> 100 95.00000 0.10318029 0.02763269 0.3852745