Journal: Risks
Publisher: MDPI
Search Results
Publications 1–10 of 24
- Linear regression for heavy tails
  Journal Article · Risks · Balkema, Guus; Embrechts, Paul (2018)
  There exist several estimators of the regression line in simple linear regression: Least Squares, Least Absolute Deviation, Right Median, Theil–Sen, Weighted Balance, and Least Trimmed Squares. Their performance for heavy tails is compared on the basis of a quadratic loss function. The case where the explanatory variable is the inverse of a standard uniform variable and where the error has a Cauchy distribution plays a central role, but heavier and lighter tails are also considered. Tables list the empirical standard deviation and bias for ten batches of one hundred thousand simulations when the explanatory variable has a Pareto distribution and the error has a symmetric Student distribution or a one-sided Pareto distribution for various tail indices. The results in the tables may be used as benchmarks. The sample size is n = 100, but results for n = ∞ are also presented. The error in the estimate of the slope need not be asymptotically normal. For symmetric errors, the symmetric generalized beta prime densities often give a good fit.
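
  A minimal sketch (not the authors' code) of the paper's central simulation setup: comparing Least Squares and Theil–Sen slope estimates when the explanatory variable is the inverse of a standard uniform variable and the errors are Cauchy. The batch size is reduced from the paper's one hundred thousand, and the true slope is an illustrative assumption.

  ```python
  import numpy as np
  from scipy.stats import theilslopes

  rng = np.random.default_rng(0)
  true_slope, n, n_sims = 1.0, 100, 1000   # n = 100 as in the paper; fewer simulations

  ls_slopes, ts_slopes = [], []
  for _ in range(n_sims):
      x = 1.0 / rng.uniform(size=n)                  # heavy-tailed explanatory variable
      y = true_slope * x + rng.standard_cauchy(size=n)
      ls_slopes.append(np.polyfit(x, y, 1)[0])       # Least Squares slope
      ts_slopes.append(theilslopes(y, x)[0])         # Theil-Sen slope

  for name, s in [("LS", ls_slopes), ("Theil-Sen", ts_slopes)]:
      s = np.asarray(s)
      print(f"{name}: bias={s.mean() - true_slope:+.4f}, sd={s.std(ddof=1):.4f}")
  ```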
- Bayesian modelling, Monte Carlo sampling and capital allocation of insurance risks
  Journal Article · Risks · Peters, Gareth W.; Targino, Rodrigo S.; Wüthrich, Mario V. (2017)
  The main objective of this work is to develop a detailed step-by-step guide to the development and application of a new class of efficient Monte Carlo methods to solve practically important problems faced by insurers under the new solvency regulations. In particular, a novel Monte Carlo method to calculate capital allocations for a general insurance company is developed, with a focus on coherent capital allocation that is compliant with the Swiss Solvency Test. The data used are based on the balance sheet of a representative stylized company. For each line of business in that company, allocations are calculated for the one-year risk with dependencies based on correlations given by the Swiss Solvency Test. Two different approaches for dealing with parameter uncertainty are discussed, and simulation algorithms based on (pseudo-marginal) Sequential Monte Carlo methods are described and their efficiency is analysed.
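
  A minimal sketch of the coherent-allocation idea via plain Monte Carlo: an Euler-style expected-shortfall allocation E[X_i | S ≥ VaR_α(S)] across lines of business. The lognormal losses and the correlation matrix are illustrative assumptions, not SST parameters, and the paper's pseudo-marginal Sequential Monte Carlo samplers are far more efficient than this brute-force version.

  ```python
  import numpy as np

  rng = np.random.default_rng(1)
  n_sims, alpha = 200_000, 0.99
  corr = np.array([[1.0, 0.3, 0.1],
                   [0.3, 1.0, 0.2],
                   [0.1, 0.2, 1.0]])                # assumed dependence between lines
  z = rng.multivariate_normal(np.zeros(3), corr, size=n_sims)
  losses = np.exp(1.0 + 0.5 * z)                    # toy lognormal line-of-business losses

  total = losses.sum(axis=1)
  var = np.quantile(total, alpha)                   # Value-at-Risk of the aggregate
  tail = total >= var
  es = total[tail].mean()                           # expected shortfall (coherent)
  allocation = losses[tail].mean(axis=0)            # Euler allocation: E[X_i | S >= VaR]
  print("ES:", es, "allocations:", allocation, "sum:", allocation.sum())
  ```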
- Consistent Re-Calibration of the discrete-time multifactor Vasiček Model
  Journal Article · Risks · Harms, Philipp; Stefanovits, David; Teichmann, Josef; et al. (2016)
  The discrete-time multifactor Vasiček model is a tractable Gaussian spot rate model. Typically, two- or three-factor versions allow one to capture the dependence structure between yields with different times to maturity in an appropriate way. In practice, re-calibration of the model to the prevailing market conditions leads to model parameters that change over time. Therefore, the model parameters should be understood as being time-dependent or even stochastic. Following the consistent re-calibration (CRC) approach, we construct models as concatenations of yield curve increments of Hull–White extended multifactor Vasiček models with different parameters. The CRC approach provides attractive tractable models that preserve the no-arbitrage premise. As a numerical example, we fit Swiss interest rates using CRC multifactor Vasiček models.
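
  A minimal sketch of the underlying building block, with assumed illustrative parameters: a discrete-time two-factor Vasiček (Gaussian AR(1)) spot rate model. The CRC concatenation and the Hull–White extension are not shown.

  ```python
  import numpy as np

  rng = np.random.default_rng(2)
  T = 120                                    # monthly time steps
  b = np.array([0.001, 0.0005])              # long-run factor levels (assumed)
  B = np.diag([0.95, 0.80])                  # mean-reversion per factor (assumed)
  sigma = np.array([0.002, 0.001])           # factor volatilities (assumed)

  x = np.zeros(2)
  rates = []
  for _ in range(T):
      x = b + B @ x + sigma * rng.standard_normal(2)  # Gaussian AR(1) factor dynamics
      rates.append(x.sum())                           # spot rate as the sum of factors
  print("first simulated spot rates:", np.round(rates[:5], 5))
  ```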
- Deep arbitrage-free learning in a generalized HJM framework via arbitrage-regularization
  Journal Article · Risks · Kratsios, Anastasis; Hyndman, Cody (2020)
  A regularization approach to model selection, within a generalized HJM framework, is introduced, which learns the closest arbitrage-free model to a prespecified factor model. This optimization problem is represented as the limit of a one-parameter family of computationally tractable penalized model selection tasks. General theoretical results are derived and then specialized to affine term-structure models, where new types of arbitrage-free machine learning models for the forward-rate curve are estimated numerically and compared to classical short-rate models and the dynamic Nelson–Siegel factor model.
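
  A toy instance of the penalized model-selection idea, assuming a Nelson–Siegel forward curve fitted to synthetic data: the objective adds lam times a penalty to the fitting error. The roughness penalty below is a stand-in; the paper's arbitrage-regularization instead penalizes deviations from HJM no-arbitrage conditions.

  ```python
  import numpy as np
  from scipy.optimize import minimize

  def ns_forward(theta, tau):
      """Nelson-Siegel instantaneous forward curve."""
      b0, b1, b2, k = theta
      k = abs(k) + 1e-8                      # keep the decay scale positive
      e = np.exp(-tau / k)
      return b0 + b1 * e + b2 * (tau / k) * e

  tau = np.linspace(0.25, 10.0, 40)
  f_obs = 0.02 + 0.01 * np.exp(-tau / 2.0)   # synthetic observed forward rates

  def objective(theta, lam=1e-2):
      fit = np.sum((ns_forward(theta, tau) - f_obs) ** 2)
      rough = np.sum(np.diff(ns_forward(theta, tau), 2) ** 2)  # stand-in penalty
      return fit + lam * rough               # penalized model-selection objective

  res = minimize(objective, x0=np.array([0.02, 0.01, 0.0, 2.0]), method="Nelder-Mead")
  print("fitted Nelson-Siegel parameters:", np.round(res.x, 4))
  ```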
- Surplus sharing with coherent utility functions
  Journal Article · Risks · Coculescu, Delia; Delbaen, Freddy (2019)
  We use the theory of coherent measures to look at the problem of surplus sharing in an insurance business. The surplus share of an insured is calculated by the surplus premium in the contract. The theory of coherent risk measures and the resulting capital allocation gives a way to divide the surplus between the insured and the capital providers, i.e., the shareholders.
- The Impact of Economic Policies on Housing Prices: Approximations and Predictions in the UK, the US, France, and Switzerland from the 1980s to Today
  Journal Article · Risks · Houlié, Nicolas (2025)
  I show that house prices can be modeled using machine learning (kNN and tree-bagging) and a small dataset composed of macroeconomic factors (MEF), including an inflation metric (CPI), US Treasury rates (10-yr), Gross Domestic Product (GDP), and the portfolio sizes of central banks (ECB, FED). This set of parameters covers all the parties involved in a transaction (buyer, seller, and financing facility) while ignoring the intrinsic properties of each asset and encompassing the local (inflation) and liquidity issues that may impede each transaction composing a market. The model here takes the point of view of a real estate trader who is interested in both the financing and the price of the transaction. Machine learning allows for the discrimination of two periods within the dataset. First, up to 2015, I show that, although the level of US Treasury rates is the most critical parameter for explaining the change of house-price indices, other macroeconomic factors (e.g., consumer price indices) are essential to include in the modeling because they highlight the degree of openness of an economy and the contribution of the economic context to price changes. Second, for the period from 2015 to today, I show that, to explain the most recent price evolution, it is necessary to include the datasets of the European Central Bank programs, which were designed to support the economy since the beginning of the 2010s. Indeed, unconventional policies of central banks may have allowed some institutional investors to arbitrage between real estate returns and other bond markets (sovereign and corporate). Finally, to assess the models' relative performances, I performed various sensitivity tests, which tend to constrain what each approach can deliver for each need. I also show that some models can predict the evolution of prices over the next four quarters with uncertainties that outperform existing index uncertainties.
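
  A minimal sketch of the two model families the paper compares (kNN and tree-bagging), fitted to synthetic stand-ins for the macroeconomic factors named above; the features and target are assumptions for illustration, not the paper's dataset.

  ```python
  import numpy as np
  from sklearn.ensemble import BaggingRegressor
  from sklearn.model_selection import train_test_split
  from sklearn.neighbors import KNeighborsRegressor
  from sklearn.tree import DecisionTreeRegressor

  rng = np.random.default_rng(3)
  n = 400
  X = np.column_stack([
      rng.normal(2.0, 1.0, n),   # CPI inflation (%), synthetic
      rng.normal(3.0, 1.5, n),   # 10-yr Treasury rate (%), synthetic
      rng.normal(2.5, 1.0, n),   # GDP growth (%), synthetic
      rng.normal(0.0, 1.0, n),   # central-bank balance-sheet factor, synthetic
  ])
  y = 100 - 5 * X[:, 1] + 2 * X[:, 0] + rng.normal(0, 2, n)  # toy house-price index

  X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
  knn = KNeighborsRegressor(n_neighbors=5).fit(X_tr, y_tr)
  bag = BaggingRegressor(DecisionTreeRegressor(), n_estimators=200,
                         random_state=0).fit(X_tr, y_tr)
  print("kNN R^2:", knn.score(X_te, y_te), "bagging R^2:", bag.score(X_te, y_te))
  ```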
- Neural networks for the joint development of individual payments and claim incurred
  Journal Article · Risks · Delong, Łukasz; Wüthrich, Mario V. (2020)
  The goal of this paper is to develop regression models and postulate distributions which can be used in practice to describe the joint development process of individual claim payments and claim incurred. We apply neural networks to estimate our regression models. As regressors we use the whole claim history of incremental payments and claim incurred, as well as any relevant feature information which is available to describe individual claims and their development characteristics. Our models are calibrated and tested on a real data set, and the results are benchmarked with the Chain-Ladder method. Our analysis focuses on the development of the so-called Reported But Not Settled (RBNS) claims. We show the benefits of using deep neural networks and the whole claim history in our prediction problem.
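
  A minimal sketch of the regression idea on synthetic data: a small feed-forward network mapping a claim's payment and incurred history to the next incremental payment. The data-generating process below is an assumption; the paper's joint models, distributional assumptions and Chain-Ladder benchmark are not reproduced here.

  ```python
  import numpy as np
  from sklearn.neural_network import MLPRegressor

  rng = np.random.default_rng(4)
  n = 2000
  paid_hist = rng.gamma(2.0, 500.0, size=(n, 3))               # payments, dev. years 1-3
  incurred = paid_hist.sum(axis=1) * rng.uniform(1.0, 1.5, n)  # case incurred (toy)
  X = np.column_stack([paid_hist, incurred])                   # claim-history regressors
  y = 0.3 * incurred - 0.1 * paid_hist.sum(axis=1) + rng.normal(0, 50, n)  # next payment

  net = MLPRegressor(hidden_layer_sizes=(20, 10), max_iter=2000,
                     random_state=0).fit(X, y)
  print("in-sample R^2:", round(net.score(X, y), 3))
  ```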
- Measuring and allocating systemic risk
  Journal Article · Risks · Brunnermeier, Markus K.; Cheridito, Patrick (2019)
  In this paper, we develop a framework for measuring, allocating and managing systemic risk. SystRisk, our measure of total systemic risk, captures the a priori cost to society for providing tail-risk insurance to the financial system. Our allocation principle distributes the total systemic risk among individual institutions according to their size-shifted marginal contributions. To describe economic shocks and systemic feedback effects, we propose a reduced-form stochastic model that can be calibrated to historical data. We also discuss systemic risk limits, systemic risk charges and a cap-and-trade system for systemic risk.
- Inhomogeneous long-range percolation for real-life network modeling
  Journal Article · Risks · Wüthrich, Mario V.; Deprez, Philippe; Hazra, Rajat S. (2015)
  The study of random graphs has become very popular for real-life network modeling, such as social networks or financial networks. Inhomogeneous long-range percolation (or scale-free percolation) on the lattice Z^d, d ≥ 1, is a particularly attractive example of a random graph model because it fulfills several stylized facts of real-life networks. For this model, various geometric properties, such as the percolation behavior, the degree distribution and graph distances, have been analyzed. In the present paper, we complement the picture of graph distances and we prove continuity of the percolation probability at the phase transition point. We also provide an illustration of the model connected to financial networks.
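
  A minimal sketch of the model on a finite box of Z^2: i.i.d. heavy-tailed vertex weights W_x, and an edge {x, y} open with probability 1 - exp(-lam * W_x * W_y / |x - y|^alpha). All parameter values are illustrative assumptions.

  ```python
  import itertools
  import numpy as np

  rng = np.random.default_rng(5)
  L, lam, alpha, tau = 20, 0.1, 3.0, 2.5
  sites = list(itertools.product(range(L), range(L)))
  w = rng.pareto(tau - 1, size=len(sites)) + 1.0   # Pareto weights, tail exponent tau - 1

  degree = np.zeros(len(sites), dtype=int)
  for i, j in itertools.combinations(range(len(sites)), 2):
      dist = np.hypot(sites[i][0] - sites[j][0], sites[i][1] - sites[j][1])
      p = 1.0 - np.exp(-lam * w[i] * w[j] / dist**alpha)  # long-range edge probability
      if rng.uniform() < p:
          degree[i] += 1
          degree[j] += 1
  print("mean degree:", degree.mean(), "max degree:", degree.max())
  ```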
- 1980–2008: The Illusion of the Perpetual Money Machine and What It Bodes for the Future
  Journal Article · Risks · Sornette, Didier; Cauwels, Peter (2014)
  We argue that the present crisis and stalling economy, ongoing since 2007, are rooted in the delusionary belief in policies based on a "perpetual money machine" type of thinking. We document strong evidence that, since the early 1980s, consumption has been increasingly funded by smaller savings, booming financial profits, wealth extracted from house price appreciation and explosive debt. This is in stark contrast with the productivity-fueled growth of the 1950s and 1960s. We describe the transition, in gestation in the 1970s, towards the regime of the "illusion of the perpetual money machine", which started at full speed in the early 1980s and developed until 2008. This regime was further supported by a climate of deregulation and a massive growth in financial derivatives designed to spread and diversify the risks globally. The result has been a succession of bubbles and crashes, including the worldwide stock market bubble and great crash of October 1987, the savings and loan crisis of the 1980s, the burst in 1991 of the enormous Japanese real estate and stock market bubbles, the emerging-market bubbles and crashes in 1994 and 1997, the Long-Term Capital Management (LTCM) crisis of 1998, the dotcom bubble bursting in 2000, the recent house price bubbles, the financialization bubble via special investment vehicles, the stock market bubble, the commodity and oil bubbles and the current debt bubble, all developing jointly and feeding on each other until 2008. This situation may be further aggravated in the next decade by an increase in financialization through exchange-traded funds (ETFs), in speed and automation through algorithmic trading, in public debt and in growing unfunded liabilities. We conclude that, to get out of this catch-22 situation, we should better manage and understand the incentive structures in our society, we need to focus our efforts on our real economy, and we have to respect and master the art of planning and prediction. Only gradual change, with clear long-term planning, can steer our financial and economic system from the turbulence associated with the perpetual money machine to calmer and more sustainable waters.