Fits a generalized additive model (GAM) to a continuous risk factor in one of three contexts: claim frequency, claim severity, or burning cost (pure premium).

Usage

fit_gam(
  data,
  nclaims,
  x,
  exposure,
  amount = NULL,
  pure_premium = NULL,
  model = "frequency",
  round_x = NULL
)

Arguments

data

A data.frame containing the insurance portfolio.

nclaims

Column in data with the number of claims.

x

Column in data with the continuous risk factor.

exposure

Column in data with the exposure.

amount

(Optional) Column in data with the claim amount. Required for model = "severity".

pure_premium

(Optional) Column in data with the pure premium. Required for model = "burning".

model

Character string specifying the model type. One of "frequency", "severity", or "burning". Default is "frequency".

round_x

(Optional) Numeric value; the risk factor x is rounded to the nearest multiple of round_x before fitting. Can speed up fitting for risk factors with many distinct values.
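
The effect of round_x can be reproduced directly in base R (an illustrative sketch; fit_gam's internal rounding may differ in detail):

```r
# Round a continuous risk factor to the nearest multiple of round_x,
# collapsing many distinct values into fewer fitting points.
x <- c(18.2, 18.9, 34.6, 35.1, 77.8)
round_x <- 0.5
round(x / round_x) * round_x
#> [1] 18.0 19.0 34.5 35.0 78.0
```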

Value

A list of class "fitgam" with the following elements:

prediction

A data frame with predicted values and confidence intervals.

x

Name of the continuous risk factor.

model

The model type: "frequency", "severity", or "burning".

data

Merged data frame with predictions and observed values.

x_obs

Observed values of the continuous risk factor.

Details

  • Frequency model: Fits a Poisson GAM to the number of claims. The log of the exposure is used as an offset so the expected number of claims is proportional to exposure.

  • Severity model: Fits a lognormal GAM to the average claim size (total amount divided by number of claims). The number of claims is included as a weight.

  • Burning cost model: Fits a lognormal GAM to the pure premium (risk premium). Implemented by aggregating exposure-weighted pure premiums. This functionality is still experimental.
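
The frequency and severity structures described above can be sketched with mgcv directly (an illustration of the model form only, not fit_gam's exact internals; the column names follow the MTPL example below, and the smooth specification s() is an assumption):

```r
library(mgcv)

# Frequency: Poisson GAM for claim counts. log(exposure) enters as an
# offset, so the fitted claim rate is per unit of exposure.
freq <- gam(nclaims ~ s(age_policyholder) + offset(log(exposure)),
            data = MTPL, family = poisson())

# Severity: lognormal GAM on the average claim size (amount / nclaims),
# restricted to records with claims and weighted by the number of claims.
sev_data <- subset(MTPL, nclaims > 0)
sev <- gam(log(amount / nclaims) ~ s(age_policyholder),
           data = sev_data, weights = nclaims, family = gaussian())
```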

References

Antonio, K. and Valdez, E. A. (2012). Statistical concepts of a priori and a posteriori risk classification in insurance. Advances in Statistical Analysis, 96(2):187–224. doi:10.1007/s10182-011-0152-7.

Grubinger, T., Zeileis, A., and Pfeiffer, K.-P. (2014). evtree: Evolutionary learning of globally optimal classification and regression trees in R. Journal of Statistical Software, 61(1):1–29. doi:10.18637/jss.v061.i01.

Henckaerts, R., Antonio, K., Clijsters, M. and Verbelen, R. (2018). A data driven binning strategy for the construction of insurance tariff classes. Scandinavian Actuarial Journal, 2018(8):681–705. doi:10.1080/03461238.2018.1429300.

Wood, S.N. (2011). Fast stable restricted maximum likelihood and marginal likelihood estimation of semiparametric generalized linear models. Journal of the Royal Statistical Society (B), 73(1):3–36. doi:10.1111/j.1467-9868.2010.00749.x.

Author

Martin Haringa

Examples

fit_gam(MTPL, nclaims = nclaims, x = age_policyholder,
        exposure = exposure)
#>            x  predicted     lwr_95    upr_95
#> 1   18.00000 0.33683705 0.27271478 0.4160361
#> 2   18.77778 0.32218997 0.26666775 0.3892723
#> 3   19.55556 0.30817998 0.26055682 0.3645075
#> 4   20.33333 0.29478165 0.25429712 0.3417114
#> 5   21.11111 0.28197418 0.24780150 0.3208594
#> 6   21.88889 0.26974205 0.24099674 0.3019160
#> 7   22.66667 0.25807532 0.23384114 0.2848210
#> 8   23.44444 0.24696930 0.22634236 0.2694760
#> 9   24.22222 0.23642372 0.21857031 0.2557354
#> 10  25.00000 0.22644165 0.21065585 0.2434104
#> 11  25.77778 0.21702844 0.20276755 0.2322923
#> 12  26.55556 0.20819044 0.19507340 0.2221895
#> 13  27.33333 0.19993360 0.18770756 0.2129560
#> 14  28.11111 0.19226234 0.18075695 0.2045001
#> 15  28.88889 0.18517864 0.17426654 0.1967740
#> 16  29.66667 0.17868117 0.16825391 0.1897546
#> 17  30.44444 0.17276468 0.16272430 0.1834246
#> 18  31.22222 0.16741962 0.15768093 0.1777598
#> 19  32.00000 0.16263204 0.15312862 0.1727252
#> 20  32.77778 0.15838349 0.14907174 0.1682769
#> 21  33.55556 0.15465092 0.14550910 0.1643671
#> 22  34.33333 0.15140695 0.14242934 0.1609504
#> 23  35.11111 0.14862005 0.13980831 0.1579872
#> 24  35.88889 0.14625487 0.13760872 0.1554443
#> 25  36.66667 0.14427229 0.13578209 0.1532934
#> 26  37.44444 0.14262987 0.13427277 0.1515071
#> 27  38.22222 0.14128239 0.13302337 0.1500542
#> 28  39.00000 0.14018239 0.13197971 0.1488949
#> 29  39.77778 0.13928082 0.13109440 0.1479785
#> 30  40.55556 0.13852786 0.13032771 0.1472440
#> 31  41.33333 0.13787411 0.12964606 0.1466244
#> 32  42.11111 0.13727167 0.12901868 0.1460526
#> 33  42.88889 0.13667536 0.12841436 0.1454678
#> 34  43.66667 0.13604407 0.12779918 0.1448209
#> 35  44.44444 0.13534205 0.12713639 0.1440773
#> 36  45.22222 0.13453992 0.12638825 0.1432173
#> 37  46.00000 0.13361546 0.12551954 0.1422336
#> 38  46.77778 0.13255422 0.12450226 0.1411269
#> 39  47.55556 0.13134973 0.12332016 0.1399021
#> 40  48.33333 0.13000323 0.12197156 0.1385638
#> 41  49.11111 0.12852296 0.12046921 0.1371151
#> 42  49.88889 0.12692325 0.11883738 0.1355593
#> 43  50.66667 0.12522346 0.11710699 0.1339025
#> 44  51.44444 0.12344655 0.11531038 0.1321568
#> 45  52.22222 0.12161762 0.11347729 0.1303419
#> 46  53.00000 0.11976256 0.11163268 0.1284845
#> 47  53.77778 0.11790690 0.10979641 0.1266165
#> 48  54.55556 0.11607463 0.10798423 0.1247712
#> 49  55.33333 0.11428738 0.10620991 0.1229792
#> 50  56.11111 0.11256383 0.10448718 0.1212648
#> 51  56.88889 0.11091931 0.10283084 0.1196440
#> 52  57.66667 0.10936563 0.10125650 0.1181242
#> 53  58.44444 0.10791109 0.09977911 0.1167058
#> 54  59.22222 0.10656065 0.09841094 0.1153853
#> 55  60.00000 0.10531631 0.09715961 0.1141578
#> 56  60.77778 0.10417745 0.09602657 0.1130202
#> 57  61.55556 0.10314121 0.09500671 0.1119722
#> 58  62.33333 0.10220313 0.09408924 0.1110167
#> 59  63.11111 0.10135754 0.09325957 0.1101587
#> 60  63.88889 0.10059811 0.09250190 0.1094029
#> 61  64.66667 0.09991824 0.09180188 0.1087522
#> 62  65.44444 0.09931152 0.09114882 0.1082052
#> 63  66.22222 0.09877204 0.09053670 0.1077565
#> 64  67.00000 0.09829468 0.08996360 0.1073973
#> 65  67.77778 0.09787524 0.08942990 0.1071181
#> 66  68.55556 0.09751063 0.08893583 0.1069122
#> 67  69.33333 0.09719881 0.08847925 0.1067777
#> 68  70.11111 0.09693873 0.08805431 0.1067196
#> 69  70.88889 0.09673020 0.08765123 0.1067496
#> 70  71.66667 0.09657374 0.08725735 0.1068848
#> 71  72.44444 0.09647024 0.08685919 0.1071447
#> 72  73.22222 0.09642076 0.08644476 0.1075480
#> 73  74.00000 0.09642621 0.08600526 0.1081098
#> 74  74.77778 0.09648712 0.08553551 0.1088409
#> 75  75.55556 0.09660332 0.08503279 0.1097483
#> 76  76.33333 0.09677375 0.08449488 0.1108370
#> 77  77.11111 0.09699627 0.08391784 0.1121129
#> 78  77.88889 0.09726760 0.08329462 0.1135846
#> 79  78.66667 0.09758320 0.08261463 0.1152639
#> 80  79.44444 0.09793730 0.08186475 0.1171654
#> 81  80.22222 0.09832305 0.08103130 0.1193048
#> 82  81.00000 0.09873263 0.08010219 0.1216962
#> 83  81.77778 0.09915751 0.07906875 0.1243502
#> 84  82.55556 0.09958874 0.07792679 0.1272722
#> 85  83.33333 0.10001728 0.07667653 0.1304631
#> 86  84.11111 0.10043439 0.07532179 0.1339196
#> 87  84.88889 0.10083198 0.07386860 0.1376375
#> 88  85.66667 0.10120300 0.07232385 0.1416137
#> 89  86.44444 0.10154181 0.07069430 0.1458497
#> 90  87.22222 0.10184440 0.06898618 0.1503530
#> 91  88.00000 0.10210858 0.06720538 0.1551388
#> 92  88.77778 0.10233412 0.06535820 0.1602289
#> 93  89.55556 0.10252277 0.06345225 0.1656508
#> 94  90.33333 0.10267806 0.06149734 0.1714348
#> 95  91.11111 0.10280433 0.05950583 0.1776083
#> 96  91.88889 0.10290626 0.05749221 0.1841936
#> 97  92.66667 0.10298930 0.05547226 0.1912090
#> 98  93.44444 0.10305920 0.05346204 0.1986681
#> 99  94.22222 0.10312138 0.05147658 0.2065797
#> 100 95.00000 0.10318029 0.04952858 0.2149501
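
A severity or burning cost fit follows the same pattern, provided the portfolio contains the required columns: amount for model = "severity" and a pure premium column for model = "burning". The column name `premium` below is hypothetical.

```r
# Severity: average claim size across levels of the risk factor
fit_gam(MTPL, nclaims = nclaims, x = age_policyholder,
        exposure = exposure, amount = amount, model = "severity")

# Burning cost (experimental): exposure-weighted pure premium
fit_gam(MTPL, nclaims = nclaims, x = age_policyholder,
        exposure = exposure, pure_premium = premium, model = "burning")
```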