In our recent paper “A Statistical Risk Assessment of Bitcoin and Its Extreme Tail Behaviour” (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2867339), we provide an extreme value analysis of the returns of Bitcoin. A particular focus is on the tail risk characteristics, for which we provide an in-depth univariate extreme value analysis. These properties are compared to those of the traditional exchange rates of the G10 currencies versus the US dollar. For investors – especially institutional ones – an understanding of the risk characteristics is of utmost importance, so for Bitcoin to become a mainstream investable asset class, studying these properties is necessary. Our findings show that the Bitcoin return distribution not only exhibits higher volatility than traditional G10 currencies, but also stronger non-normal characteristics and heavier tails. This has implications for risk management and financial engineering (such as Bitcoin derivatives), both from an investor’s and from a regulator’s point of view. To our knowledge, this is the first detailed study of the extreme value behaviour of the cryptocurrency Bitcoin.
In our recent paper “The Statistics of Bitcoin and Cryptocurrencies” (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2872158), we examine the statistical properties of the most important cryptocurrencies, of which Bitcoin is the most prominent example. We characterize their exchange rates versus the US dollar by fitting parametric distributions to them. It is shown that the returns are clearly non-normal, with standard heavy-tailed distributions giving good descriptions of the data. The results are important for investment and risk management purposes.
Figure: Histogram of the Bitcoin/USD exchange rate returns, with the normal as well as various heavy-tailed distributions fitted to the data (from 2014 to 2016).
You can see that the returns are clearly not normal. Once you move to the class of heavy-tailed distributions, the fit becomes reasonable, and you can, for simplicity, choose the t-distribution. This resembles results known from traditional fiat currency exchange rates.
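The comparison above can be sketched in a few lines: fit a normal and a Student's t distribution by maximum likelihood and compare log-likelihoods. The data here is synthetic, standing in for daily Bitcoin/USD returns (the paper's actual data is not reproduced); with heavy-tailed input, the t fit clearly dominates.

```python
# Sketch: normal vs. Student's t fit on (synthetic) daily returns.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical stand-in for BTC/USD daily returns: heavy-tailed draws
returns = stats.t.rvs(df=3, scale=0.03, size=1000, random_state=rng)

# Maximum-likelihood fits of both candidate distributions
mu, sigma = stats.norm.fit(returns)
df, loc, scale = stats.t.fit(returns)

ll_norm = stats.norm.logpdf(returns, mu, sigma).sum()
ll_t = stats.t.logpdf(returns, df, loc, scale).sum()
print(f"normal log-lik: {ll_norm:.1f}, t log-lik: {ll_t:.1f}")
```

The fitted degrees-of-freedom parameter of the t distribution is itself a rough tail-heaviness gauge: low values indicate much heavier tails than the normal.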
In our recent paper “Bitcoin and Cryptocurrencies – Not for the Faint-Hearted” (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2867671), we take a broader look at Bitcoin and other cryptocurrencies.
Cryptocurrencies became popular with the emergence of Bitcoin and have shown unprecedented growth over the last few years. As of November 2016, more than 720 cryptocurrencies exist, with Bitcoin still the most popular one. We provide both a statistical analysis and an extreme value analysis of the returns of the most important cryptocurrencies. A particular focus is on the tail risk characteristics, for which we provide an in-depth univariate and multivariate extreme value analysis. The tail dependence of cryptocurrencies is investigated using empirical copulas. For investors – especially institutional ones – as well as regulators, an understanding of the risk and tail characteristics is of utmost importance. For cryptocurrencies to become a mainstream investable asset class, studying these properties is necessary. Our findings show that cryptocurrencies exhibit strong non-normal characteristics, heavy tails and, depending on the particular pair of cryptocurrencies, large tail dependencies. Statistical similarities can be observed for cryptocurrencies that share the same underlying technology. This has implications for risk management and financial engineering (such as derivatives on cryptocurrencies), both from an investor’s and from a regulator’s point of view. To our knowledge, this is the first detailed study of the extreme value behaviour of cryptocurrencies, their correlations and tail dependencies, and their statistical properties.
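The tail dependence idea mentioned above can be illustrated with a minimal empirical-copula estimate: transform both series to ranks and measure how often both fall in their joint lower tail. The data below is simulated correlated noise, not the paper's cryptocurrency returns, and the threshold q is an illustrative choice.

```python
# Sketch: empirical lower-tail dependence between two return series.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
# Correlated stand-ins for two cryptocurrency return series
z = rng.multivariate_normal([0, 0], [[1, 0.8], [0.8, 1]], size=n)
x, y = z[:, 0], z[:, 1]

def empirical_lower_tail_dep(x, y, q=0.05):
    """Estimate P(V <= q | U <= q) from ranks (the empirical copula)."""
    u = (np.argsort(np.argsort(x)) + 1) / (len(x) + 1)  # pseudo-observations in (0, 1)
    v = (np.argsort(np.argsort(y)) + 1) / (len(y) + 1)
    joint = np.mean((u <= q) & (v <= q))
    return joint / q

lam = empirical_lower_tail_dep(x, y)
print(lam)
```

Under independence the estimate would be close to q itself; values far above that signal that joint crashes are much more likely than independent tails would suggest.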
On September 15, 2016, we will host the «1st European COST Conference on Mathematics for Industry in Switzerland».
We aim to bring together European academics, young researchers, students and industrial practitioners to promote «Mathematics for Industry» in Switzerland, similar to what the European Consortium for Mathematics in Industry (ECMI) does at a Europe-wide level.
A number of well-known practitioners and academics will give talks:
Michael Aichinger, uni software plus gmbh
Anastasios Badarlis, Endress+Hauser Flowtec AG
Prof. Dr. Joachim Giesen, Universität Jena
Björn Heitmann, Bruker BioSpin AG
Dr. Vivek Kumar, Endress+Hauser Flowtec AG
Roman Mäder, MathConsult AG
Mathematics for Smart Maintenance:
Patrick De Causmaeker, Professor of Computer Science, KU Leuven, Head of CODeS@KULAK
Pierre Dersin, Ph.D., Alstom
Prof. Dr. Diego Galar, Luleå University of Technology
Bob Huisman, Manager Maintenance Development, NedTrain
Francesco Mari, Vice President Product and Innovation Strategy Internet of Things and Customer Innovation, SAP
Prof. Dr. Karl Tuyls, University of Liverpool
Dr. Andreas Binder, CEO MathConsult GmbH
Dr. Ulrich Nögel, Big-xyt, Co-Founder, Analytics
Dr. Olivier Schmid, Fisch Asset Management GmbH
Sandro Schmid, Head of Swiss Risk Association
Dr. Artur Sepp, Director, Quantitative Analytics and Research, Julius Bär
Prof. Dr. Stephan Sturm, Worcester Polytechnic Institute
Dr. Thomas Wiecki, Data Science Lead at Quantopian
Prof. Dr. Uwe Wystup, University of Antwerpen, Managing Director at MathFinance AG
Please register here:
Dr. Jörg Osterrieder, Senior Lecturer Financial Mathematics
Most of today’s financial practice is still based on the Nobel-prize-winning model that Harry Markowitz developed in 1952. The search for innovative approaches to portfolio construction has intensified in recent years, as Markowitz optimization turned out to be fragile, particularly during financial crises. The trend towards factor-based asset allocation (FBAA) attempts to address this and could shape asset management in the future. But how does factor-based investing work? What are the advantages of FBAA over the standard approach of a strategic asset allocation (SAA) with tactical deviations (TAA)? Can FBAA make a portfolio’s diversification more robust? Is factor-based asset allocation suitable only for institutional clients, or can private investors benefit as well?
At the Finance Circle in early June, three distinguished experts examined all of these aspects. Prof. Dr. Peter Meier, Institut für Wealth & Asset Management (IWA), guided the audience through the evening and also moderated the panel discussion.
Effective diversification instead of return maximization
Carmine Orlacchio, CIO and founding partner of OLZ & Partners Asset and Liability Management AG, briefly outlined classical portfolio theory and stressed that close cooperation between the financial industry and research institutions is highly relevant for overcoming the current shortage of attractive investment opportunities. «In the current situation, many investors are tempted to neglect the risk component in portfolio construction,» said Orlacchio.
Factor-based asset allocation
Dr. Daniel Höchle, Institut für Wealth & Asset Management at ZHAW, presented the basic idea of factor-based asset allocation. He made clear that factor-based investing can lead to more robust diversification and that the approach is suitable for both institutional and private investors, provided they have a sufficiently long investment horizon. «The essential point is that under FBAA, unlike the standard approach, the portfolio’s exposure is defined not with respect to asset classes but with respect to risk factors, which are understood as the drivers of asset-class returns,» explained Daniel Höchle. Since the factor exposures of the asset classes change over time, an asset allocation with constant positions in the risk factors leads to a dynamic adjustment of the asset-class weights. «The main advantage of FBAA turned out to be its more refined control of portfolio risks,» Höchle summarized. «The standard approach, in contrast, stands out for its simplicity.»
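The mechanics behind constant factor positions driving dynamic asset-class weights can be sketched as a small linear-algebra exercise. The factor loadings and exposure targets below are purely illustrative numbers, not figures from the talk.

```python
# Sketch: back out asset-class weights from target factor exposures.
import numpy as np

# Rows: asset classes (e.g. equities, government bonds, corporate bonds);
# columns: risk factors (e.g. equity, duration, credit). Hypothetical loadings.
B = np.array([
    [1.0, 0.0, 0.1],
    [0.0, 1.0, 0.0],
    [0.2, 0.8, 1.0],
])
target = np.array([0.5, 0.4, 0.2])  # desired constant factor exposures

# Portfolio factor exposure is B.T @ w, so solve B.T @ w = target for w
w = np.linalg.solve(B.T, target)
print(w)
```

As the loadings B drift over time, re-solving this system keeps the factor exposures constant, which is exactly the dynamic reweighting of asset classes described above.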
Prof. Dr. Heinz Zimmermann, Faculty of Business and Economics at the University of Basel, analyzed why the Markowitz approach is so widespread and how the American economist revolutionized the modeling of financial markets more than sixty years ago. Using modern portfolio theory as an example, he showed the challenges that arise when one tries to put financial innovations into practice. His conclusion was that the Markowitz approach is by no means obsolete, but that successfully implementing the model in practice requires clever approaches to estimating the input parameters; simply extrapolating from historical data does not work.
For further information on factor-based asset allocation: Dr. Daniel Höchle, Institut für Wealth & Asset Management
Author: Wolfgang Breymann
Fintech or, as I prefer to call it, the industrialization of the financial sector, is a worldwide transformation whose goal is to make the latest developments in ICT and data analysis available to finance. It is part of the even broader digitization of the world and encompasses not only the development of new products and new business models but, perhaps even more profoundly, the way future financial institutions will operate.
Automation is an integral part of industrialization. Thus, we should expect the (highly) automated bank to emerge from this industrialization process, similar to the automated manufacturing plants that are commonplace in the car industry. While manufacturing plants deal with physical things, banks essentially deal with data, using computers. A bank can also be considered an IT company serving the financial sector. Still, in spite of using computers heavily, the financial industry is only just transitioning from the pre-industrial era into the industrial one. The reason is the lack of automation, and one of the impediments is missing data standards – or, more precisely, data standards that are not appropriately adapted to the use case.
Since fall 2012 my team has been participating in the project ACTUS, which is the acronym for Algorithmic Contract Type Unifying Standard. The original goal of the project was to develop a data and modeling standard for financial contracts in view of financial analysis and risk assessment. This sounds (and is) quite technical, and at the same time it is at the core of the functionality of a bank, namely the assessment of future financing needs and of the future values of a bank’s overall positions. It is important that this task can be carried out consistently, quickly and transparently. Unbelievable as it may be for the non-expert, this is far from the case today: if the regulator demands that banks carry out a stress test, it takes weeks if not months until the results are available, and they are not even comparable across banks.

Such an assessment is not infeasible per se if one uses the right concepts and technology. Typically it relies heavily on simulation, which, besides experiment and theory, is a standard approach in science for understanding complex systems and establishing effective control. If based on “first principles”, the simulation of a complex system needs to build on granular data, i.e. it must start at the level of the system’s basic building blocks. Business organizations, and especially financial institutions, can be understood as just such complex systems. Simulating their business at a granular level is a complex computational task similar to weather forecasting or other large-volume data-processing tasks. However, while the meteorological infrastructure evolved over decades in a collective effort, so that today local, regional and even global weather forecasts are a common standard, financial institutions lack the risk assessment infrastructure necessary for similarly frequent and consistent risk assessments at the different levels of the system.
In part, the lack of appropriate risk-management capabilities in the financial industry can be attributed to the fact that risk management could not keep pace with financial innovation. With the advent of financial mathematics, new and highly complex financial instruments have been engineered over the course of the last two decades. Risk management, on the other hand, has failed to take full advantage of the developments in modern ICT and still relies on (often ad hoc and not properly systematized) analytical shortcuts and on traditional technology such as Excel, which is not properly maintainable. ACTUS provides a centerpiece for changing this situation.
Atomic building blocks
The basic (quasi-atomic) building blocks of a bank’s balance sheet are the financial contracts (also called financial instruments or assets). They span the whole financial universe, ranging from widely known securities such as stocks and bonds to complex derivatives. The balance sheet of even a mid-size bank consists of millions of contracts. Consistent risk assessment is still a big challenge, owing to this heterogeneity as well as to the fact that different risk factor categories must be treated differently in financial valuation. For all but the most basic derivatives, consistent risk assessment requires Monte Carlo (MC) simulation to properly take into account the effect of future price fluctuations of the underlying instruments on the derivative price.
Important in the following is that, going back to first principles, the ingredients for all kinds of financial analysis, such as the valuation of a financial contract, are the expected future cash flows together with adequate discount factors. This means that the future cash flows must be evaluated at the level of the individual contracts (i.e., the granular level). This requires large IT resources, in terms of both storage and CPU, in particular if future uncertainty is appropriately taken into account through a multiplicity of risk scenarios.
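The first-principles valuation just described, expected cash flows plus discount factors, averaged over risk scenarios, can be sketched for a toy contract. The bond terms and the simple mean-reverting rate model below are illustrative assumptions, not part of ACTUS.

```python
# Sketch: Monte Carlo valuation of a 3-year annual-coupon bond.
import numpy as np

rng = np.random.default_rng(2)
n_scenarios, years = 10_000, 3
notional, coupon = 100.0, 0.04

# Simulate short-rate paths, mean-reverting around 2% (toy model)
r = np.full((n_scenarios, years), 0.02)
for t in range(1, years):
    r[:, t] = r[:, t - 1] + 0.5 * (0.02 - r[:, t - 1]) + 0.01 * rng.standard_normal(n_scenarios)

# Cash flows of the bond: a coupon each year, plus the notional at maturity
cf = np.full((n_scenarios, years), notional * coupon)
cf[:, -1] += notional

# Discount factors from the simulated rates, then average across scenarios
disc = np.exp(-np.cumsum(r, axis=1))
value = np.mean(np.sum(cf * disc, axis=1))
print(round(value, 2))
```

Even this toy example shows why the resource demands scale quickly: the cash-flow and discount arrays grow with the number of contracts times the number of scenarios.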
In its conception, ACTUS follows the book “Unified Financial Analysis” by Brammertz et al. (2009), which is co-authored by the author of this contribution. ACTUS supports an analytical process that can be organized in the form of a data supply chain, as depicted in the following figure.
The main parts are the following.
- The input elements consisting of financial contracts and risk factors.
- Financial contracts play the central role in this model. They consist of contract data and algorithms. The contract algorithms encode the legal contract rules relevant for cash flow generation (who pays how much to whom under which circumstances), while the contract data provide the parameters necessary for the full contract specification.
- Risk factors determine the state of the financial and economic environment. They are further divided into factors for market risk, factors for counterparty risk, and all the remaining risk factors lumped together in a third catch-all category called “Behavior”. The important property of risk factors is that their future state is unknown. The most important market risk factors are interest rates, foreign exchange rates, stock indices and commodity indices. Counterparty risk factors typically consist of credit ratings and/or default probabilities.
In order to generate the cash flows encoded in a contract, both contract data and risk factor information are needed, because the contract rules may refer to market information, such as interest rates in the case of a variable-rate bond. Notice that the separation of risk factors and contracts is important because it separates the known from the unknown: the contract rules are deterministic (known), while the future development of the risk factors is unknown; they have random components. For a given risk factor scenario, i.e. an assumed future development of the risk factors’ values, the future state of a contract is completely determined.
- The raw results are cash flow streams, together with some auxiliary information, obtained as output of the contract algorithms. Assuming n contracts and k risk factor scenarios, there will be n × k cash flow streams consisting of 20 to 50 events each. Since there are millions of contracts on a bank’s balance sheet and an MC simulation may contain 10,000 risk scenarios or even more, the size of the data can easily be of the order of terabytes for large institutions and petabytes for the whole financial system. This requires the use of Big Data technologies; tests in this direction are currently under way.
- The different types of financial analysis, such as liquidity and solvency calculations, are carried out on top of the raw results. This encompasses income analysis, sensitivity analysis and different kinds of risk measures. An important feature is the possibility of flexibly aggregating the results according to different criteria.
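The supply chain above, deterministic contract algorithms fed by random risk factor scenarios, producing n × k raw cash flow streams that are then aggregated, can be sketched end to end. The contract fields, the rate model and the aggregation criterion are all illustrative choices, not ACTUS terms.

```python
# Sketch: contract algorithm + risk scenarios -> raw results -> aggregation.
import numpy as np

rng = np.random.default_rng(0)

def variable_rate_bond_cashflows(contract, rate_path):
    """Deterministic contract rules: coupon = reference rate + spread."""
    events = [(t, contract["notional"] * (r + contract["spread"]))
              for t, r in enumerate(rate_path, start=1)]
    t_last, last_cf = events[-1]
    events[-1] = (t_last, last_cf + contract["notional"])  # principal at maturity
    return events

contracts = [{"id": "b1", "notional": 1000.0, "spread": 0.01, "currency": "USD"},
             {"id": "b2", "notional": 500.0, "spread": 0.02, "currency": "CHF"}]

# k risk factor scenarios: random 3-year interest-rate paths around 2%
scenarios = 0.02 + 0.005 * rng.standard_normal((1000, 3))

# Raw results: n x k cash flow streams, keyed by (contract, scenario)
raw = {(c["id"], k): variable_rate_bond_cashflows(c, path)
       for c in contracts for k, path in enumerate(scenarios)}

# Flexible aggregation on top: expected total cash flow per currency
totals = {}
for c in contracts:
    streams = [sum(cf for _, cf in raw[(c["id"], k)]) for k in range(len(scenarios))]
    totals[c["currency"]] = totals.get(c["currency"], 0.0) + float(np.mean(streams))
print(totals)
```

Note how all randomness lives in the scenario array: the same contract function is reused unchanged for every scenario, which is exactly the separation of the known from the unknown described above.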
In addition to ACTUS there is another – complementary – industry initiative that aims at providing the financial industry with the framework necessary to support financial analysis on a granular level. The Global Legal Entity Identifier (GLEI), which is already in its deployment phase, provides a worldwide system for assigning unique identifiers to any entity that is the counterparty to a financial contract. Taken together with ACTUS, which aims at creating a standard machine-readable algorithmic representation of the contingent cash-flow obligations embedded in financial contracts, these two foundational initiatives provide the necessary granular data basis for consistent financial risk management.
From a software engineering point of view, ACTUS is a library of Java routines that is in the process of being released as open source by the ACTUS Users Association (AUA). It cannot be used stand-alone but must be embedded in a suitable environment that adds risk factor models, manages data input and output, and carries out the analytics based on the cash flow results. The AUA is promoting the integration of ACTUS by financial institutions and third-party software vendors. Currently there are three ways to use ACTUS: one for pure demonstration purposes, one for prototyping, and one for deployment as an operational system in financial institutions.
The first, provided by the Contract Type Calculator on the ACTUS website, lets the user enter data into a web form, offers some choices of simple risk factor models, carries out the computation, and presents the results in graphical and tabular form. The website also contains more detailed information on the contracts.
The second consists of a number of R packages bundled under the name Risk & Finance Lab (RFL). They provide access to the ACTUS contract types and support financial modeling and analysis. We used them to carry out an ACTUS proof of concept with about 4000 real bonds, cf. Breymann et al. (2016). The packages will soon be available on the CRAN server.
The third way of using ACTUS is through the Ariadne Risk Management Platform, a professional risk management platform for financial and non-financial institutions newly developed on the basis of ACTUS.
RFL and Ariadne as well as open challenges such as using ACTUS with Big-Data technology and extending ACTUS contract types into full-fledged smart contracts by attaching them to a public ledger (blockchain technology) will be the subject of future blog contributions.
W. Brammertz, I. Akkizidis, W. Breymann, R. Entin and M. Rüstmann, Unified Financial Analysis. Wiley, Chichester, 2009.
W. Breymann, N. Bundi, J. Micheler and K. Stockinger, Large-Scale Data-Driven Financial Risk Assessment. In preparation, 2016.