Semester Schedule: Risk Seminar - Spring 2013
Seminars are on Tuesdays
Time: 2:40 - 3:55 PM; Location: 227 Mudd
Jan 22
Speaker: Professor Jose Garrido
Department of Mathematics and Statistics
Concordia University
Title: Risk management for heavy tail losses, black swans and other catastrophes
Abstract:
Risk measures are now commonly used in actuarial and financial risk management alike, for problems such as pricing, reinsurance optimization, capital allocation, portfolio management and credit risk.
With the notable exception of Value at Risk (VaR), most well-accepted measures apply only to risks with finite moments. Mathematically, this restricts the set of risk random variables to L^p for some p >= 1, which excludes heavy-tailed risks in L^0.
Without getting into the subjective choice of what properties are reasonable for a risk measure, we revisit the risk management problem for heavy-tailed risks through a personal survey of ideas in functional analysis and convex optimization.
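A standard textbook illustration of the finite-moment issue (added for context, not part of the abstract) is the Pareto family, where a quantile-based measure such as VaR stays finite even when every moment fails to exist:

```latex
% Standard example, added for context (not from the abstract): a Pareto
% loss with tail
\[
  \Pr(X > x) = \Bigl(\tfrac{x}{x_m}\Bigr)^{-\alpha}, \qquad x \ge x_m,
\]
% satisfies $\mathbb{E}[X^p] < \infty$ only for $p < \alpha$, so for
% $\alpha \le 1$ it lies outside every $L^p$ with $p \ge 1$ and
% expectation-based measures are undefined.  Its Value at Risk, however,
\[
  \mathrm{VaR}_u(X) = x_m (1-u)^{-1/\alpha},
\]
% is finite for every confidence level $u < 1$.
```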
Jan 29
Speaker: Maria Victoria Rivas
Complutense University of Madrid
Title: Application of QCRM (Quality Control of Risk Measure) to ORSA
(Own Risk and Solvency Assessment)
Abstract:
EIOPA (European Insurance and Occupational Pensions Authority), NAIC
(National Association of Insurance Commissioners), the Office of the
Superintendent of Financial Institutions Canada (OSFI) and other
regulators are working on a new regulatory requirement called ORSA
(Own Risk and Solvency Assessment). ORSA is designed to improve the risk
management, reporting and assessment process, especially in the risk
assessment phase. This presentation focuses on the application of Quality
Control of Risk Measure (QCRM) to the Own Risk and Solvency Assessment.
QCRM is a statistical approach to assessing the quality of risk measures.
The approach is applied to the problem of backtesting for insurance
companies in the framework of ORSA. Finally, we will analyze the
application of the QCRM approach to real insurance data.
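The abstract does not spell out the QCRM procedure itself; as a rough illustration of the kind of statistical backtest such quality-control approaches build on, here is a minimal sketch of a generic VaR exceedance test (Kupiec's proportion-of-failures test), with invented numbers:

```python
# Illustrative only: the abstract does not describe QCRM itself, so this
# is a generic VaR exceedance backtest (Kupiec's proportion-of-failures
# test) of the kind that statistical quality checks of a risk measure
# build on. All numbers below are invented.
import math

def kupiec_pof_test(n_days: int, n_exceedances: int, var_level: float):
    """Likelihood-ratio test that the observed exceedance rate matches the
    nominal tail probability 1 - var_level (assumes 0 < n_exceedances < n_days)."""
    p = 1.0 - var_level              # nominal probability of an exceedance
    x, t = n_exceedances, n_days
    phat = x / t                     # observed exceedance frequency
    def loglik(q):                   # binomial log-likelihood at exceedance rate q
        return (t - x) * math.log(1.0 - q) + x * math.log(q)
    lr = -2.0 * (loglik(p) - loglik(phat))
    p_value = math.erfc(math.sqrt(lr / 2.0))   # chi-square (1 df) tail probability
    return lr, p_value

if __name__ == "__main__":
    # 250 trading days and 9 exceedances of a 99% VaR (about 2.5 expected).
    lr, pval = kupiec_pof_test(n_days=250, n_exceedances=9, var_level=0.99)
    print(f"LR = {lr:.2f}, p-value = {pval:.4f}")   # small p-value => reject the VaR model
```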
Feb 5
Speaker: Qiang Zeng, University of Illinois Urbana-Champaign
Title: Noncommutative Bernstein's exponential type inequalities and applications to Rosenthal's inequalities, random matrices and compressed sensing
Abstract:
Rosenthal/Burkholder inequalities are moment estimates for sums of independent random variables or of martingale difference sequences. The noncommutative versions were established by Junge and Xu in recent years. As the noncommutative model includes classical random variables and random matrices as special cases, the precise moment estimates are expected to have applications in various areas. In this talk, I will first present an improved noncommutative Rosenthal inequality and derive an invertibility result for random matrices due to Candes, Romberg and Tao in compressed sensing. I will also discuss the corresponding martingale versions of Bernstein- and Burkholder-type inequalities. From these, Poincaré-type inequalities will follow. If time permits, applications to concentration and transportation inequalities will be given as well. No background in noncommutative probability will be assumed for this talk. Joint work with Marius Junge.
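For readers unfamiliar with the classical scalar inequalities the talk generalizes, the standard (commutative) statements are as follows; the noncommutative formulations and the constants are those of the talk and are not reproduced here:

```latex
% Classical (scalar) versions of the inequalities the talk generalizes;
% standard statements added for context only.
% Rosenthal: for independent, mean-zero $X_1,\dots,X_n$ and $p \ge 2$,
\[
  \mathbb{E}\Bigl|\sum_{k=1}^n X_k\Bigr|^p
  \;\le\; C_p \Bigl( \sum_{k=1}^n \mathbb{E}|X_k|^p
  + \Bigl(\sum_{k=1}^n \mathbb{E} X_k^2\Bigr)^{p/2} \Bigr).
\]
% Bernstein: if in addition $|X_k| \le M$ almost surely and
% $\sigma^2 = \sum_{k=1}^n \mathbb{E} X_k^2$, then for all $t > 0$,
\[
  \Pr\Bigl( \Bigl|\sum_{k=1}^n X_k\Bigr| \ge t \Bigr)
  \;\le\; 2 \exp\Bigl( - \frac{t^2}{2\sigma^2 + \tfrac{2}{3} M t} \Bigr).
\]
```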
Feb 12
Speaker: Professor Jingchen Liu, Columbia University
"Rare-event Analysis and Simulations for Gaussian and Its Related Processes"
Gaussian processes are employed to model spatially varying errors in various stochastic systems. In this talk, we consider the analysis of the extreme behaviors and the rare-event simulation problems for such systems. In particular, the topic covers various nonlinear functionals of Gaussian processes including the supremum norm, integral of convex functions, and stochastic partial differential equations with random coefficients. We present the asymptotic results and the efficient simulation algorithms for the associated rare-event probabilities. |
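The talk's efficient algorithms are not reproduced here; the sketch below only sets up the basic object, the exceedance probability of the supremum of a discretized Gaussian process, and estimates it by crude Monte Carlo, the baseline that specialized rare-event schemes are designed to beat. The kernel, grid and threshold are illustrative choices.

```python
# Toy setup only (not the talk's algorithms): estimate P(sup_t X(t) > b)
# for a discretized Gaussian process by crude Monte Carlo. For larger b
# this estimator needs enormous sample sizes, which is what motivates
# efficient rare-event simulation schemes.
import numpy as np

rng = np.random.default_rng(0)

# Centered Gaussian process with squared-exponential covariance on [0, 1]
# (an illustrative kernel and grid, not tied to the talk).
t = np.linspace(0.0, 1.0, 100)
cov = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2 * 0.1 ** 2))
chol = np.linalg.cholesky(cov + 1e-8 * np.eye(len(t)))   # small jitter for stability

b = 2.5                   # modest threshold, so crude Monte Carlo still works
n_samples = 50_000
paths = rng.standard_normal((n_samples, len(t))) @ chol.T
sup = paths.max(axis=1)   # supremum over the grid

p_hat = (sup > b).mean()
rel_err = np.sqrt((1 - p_hat) / (p_hat * n_samples)) if p_hat > 0 else np.inf
print(f"P(sup X > {b}) ~= {p_hat:.2e}  (relative error ~ {rel_err:.1%})")
```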
Feb 19
Speaker: Jose H. Blanchet, Industrial Engineering and Operations Research,
Columbia University
Title: On Stochastic Insurance and Reinsurance Risk Networks
Abstract:
In the last few years, substantial interest has been devoted to the study of systemic risk. In this talk we describe a class of models for systemic risk analysis in the setting of insurance and reinsurance participants. The models are constructed with the aim of capturing features such as cascading effects at the time of default due to counter-party risk and contagion. We also impose a probabilistic structure that allows us to rigorously study risk analysis questions using the theory of combinatorial optimization. In the end, we are interested in answering questions such as: a) Which groups of companies are the most relevant from a systemic risk standpoint? b) How can we quantify the role of reinsurance companies in systemic risk? c) How can we understand and quantify the role of a regulator and the associated capital requirements for systemic risk mitigation?
This talk is joint work with Yixi Shi.
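The abstract does not specify the model, so the following is only a toy illustration of what a cascading default due to counter-party exposures looks like; the network, exposures and capital buffers are all invented:

```python
# Toy illustration only (not the speaker's model): a default cascade on a
# small, made-up network of insurers/reinsurers. exposures[i, j] is the
# loss company i suffers if company j defaults; a company defaults once
# its accumulated losses exceed its capital buffer.
import numpy as np

def cascade(exposures: np.ndarray, capital: np.ndarray, initial_default: int) -> np.ndarray:
    n = exposures.shape[0]
    defaulted = np.zeros(n, dtype=bool)
    defaulted[initial_default] = True
    while True:
        # Total loss inflicted on each company by the defaults so far.
        losses = exposures[:, defaulted].sum(axis=1)
        newly = (losses > capital) & ~defaulted
        if not newly.any():
            return np.flatnonzero(defaulted)
        defaulted |= newly

if __name__ == "__main__":
    exposures = np.array([[0, 4, 0, 0],
                          [0, 0, 3, 0],
                          [2, 0, 0, 5],
                          [0, 0, 0, 0]], dtype=float)   # invented exposure matrix
    capital = np.array([3.0, 2.0, 4.0, 6.0])            # invented capital buffers
    print("defaulted companies:", cascade(exposures, capital, initial_default=3))
```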
Feb 26
Speaker: Sheldon Ross, Department of Industrial and Systems Engineering,
University of Southern California
Title: Queueing Loss Models with Heterogeneous Servers and Discriminating
Arrivals
Abstract: We consider an n server queueing loss model in which the service times of server i have distribution Fi. Each arriving customer has a vector (X1, ..., Xn), where Xi is the indicator of the event that server i is eligible to serve that customer. The vectors of successive arrivals are assumed to be independent with a common distribution. An arriving customer can be assigned to any currently idle server that is eligible to serve the customer; if there are no such servers, then the customer is lost. Assuming that the random vector (X1, ..., Xn) is exchangeable, we find the optimal policy that minimizes the proportion of customers that are lost, both when the Fi, 1 <= i <= n, are exponential with known rates and, in the case of Poisson arrivals, when the Fi are general but unknown and only idle-time ordering rules are allowed. One such rule would be to assign an arriving customer to a randomly chosen idle-eligible server; another would be to assign to the idle-eligible server that has been idle the longest; a third is to assign to the idle-eligible server that has been idle the shortest.
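A small discrete-event sketch of such a loss system, comparing two of the idle-time ordering rules named above (random idle-eligible vs. longest-idle). All rates, the number of servers and the eligibility probability are invented, and the simulation says nothing about the optimality results of the talk:

```python
# Toy discrete-event sketch of the loss model described above, comparing
# two of the idle-time ordering rules from the abstract. The arrival
# rate, service rates, number of servers and eligibility probability are
# invented; nothing here reflects the optimality results of the talk.
import numpy as np

def simulate(rule: str, n_arrivals: int = 100_000, seed: int = 1) -> float:
    rng = np.random.default_rng(seed)
    n = 4                                   # number of servers
    lam = 3.0                               # Poisson arrival rate
    mu = np.array([0.5, 1.0, 1.5, 2.0])     # exponential service rate of each server
    q = 0.6                                 # P(server i is eligible), iid => exchangeable
    free_at = np.zeros(n)                   # time at which each server next becomes idle
    t, lost = 0.0, 0
    for _ in range(n_arrivals):
        t += rng.exponential(1.0 / lam)
        eligible = rng.random(n) < q
        candidates = np.flatnonzero(eligible & (free_at <= t))
        if candidates.size == 0:
            lost += 1                       # no idle-eligible server: customer is lost
            continue
        if rule == "random":
            i = rng.choice(candidates)
        else:                               # "longest-idle"
            i = candidates[np.argmin(free_at[candidates])]
        free_at[i] = t + rng.exponential(1.0 / mu[i])
    return lost / n_arrivals

if __name__ == "__main__":
    for rule in ("random", "longest-idle"):
        print(f"{rule:12s} loss fraction ~= {simulate(rule):.4f}")
```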
March 5
Speaker: Milan Bradonjic, Bell Labs, Alcatel-Lucent
Title: "Bootstrap Percolation on Random Geometric Graphs"
Abstract:
Bootstrap percolation has been used effectively to model phenomena as
diverse as emergence of magnetism in materials, spread of infection,
diffusion of software viruses in computer networks, adoption of new
technologies, and emergence of collective action and cultural fads in
human societies. It is defined on an (arbitrary) network of
interacting agents whose state is determined by the state of their
neighbors according to a threshold rule. In a typical setting,
bootstrap percolation starts by random and independent ``activation''
of nodes with a fixed probability $p$, followed by a deterministic
process for additional activations based on the density of active
nodes in each neighborhood ($\theta$ activated nodes). Here, we study
bootstrap percolation on random geometric graphs in the regime when
the latter are (almost surely) connected. Random geometric graphs
provide an appropriate model in settings where the neighborhood
structure of each node is determined by geographical distance, as in
wireless ad hoc and sensor networks as well as in contagion. We derive
bounds on the critical thresholds $p_c', p_c''$ such that for all $p >
p''_c(\theta)$ full percolation takes place, whereas for $p <
p'_c(\theta)$ it does not.
Joint work with Iraj Saniee.
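A compact simulation of the process just described, bootstrap percolation on a random geometric graph; the parameters (number of nodes, radius, p, theta) are illustrative and not tied to the connectivity regime or critical thresholds analyzed in the talk:

```python
# Small simulation of the process described above: bootstrap percolation
# on a random geometric graph. The number of nodes, connection radius,
# activation probability p and threshold theta are illustrative only.
import numpy as np

def bootstrap_percolation_rgg(n=2000, radius=0.05, p=0.1, theta=3, seed=0) -> float:
    rng = np.random.default_rng(seed)
    pts = rng.random((n, 2))                                  # nodes in the unit square
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
    adj = d2 <= radius ** 2                                   # connect nodes within `radius`
    np.fill_diagonal(adj, False)
    active = rng.random(n) < p                                # independent initial activation
    while True:
        # A node activates once at least `theta` of its neighbors are active.
        newly = ~active & (adj[:, active].sum(axis=1) >= theta)
        if not newly.any():
            return active.mean()
        active |= newly

if __name__ == "__main__":
    for p in (0.05, 0.10, 0.20):
        print(f"p = {p:.2f}: final active fraction = {bootstrap_percolation_rgg(p=p):.3f}")
```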
March 12
Speaker: Hongzhong Zhang, Department of Statistics, Columbia University
Title: Quickest detection in a system with correlated noise
Abstract: The problem of quickest detection arises in many applications in quality control and financial surveillance. In this work, we study the quickest detection of signals in a system of 2 sensors coupled by a negatively correlated noise, which receive continuous sequential observations from the environment. It is assumed that the signals are time-invariant and of equal strength, but that their onset times may differ from sensor to sensor. The objective is the optimal detection of the first time at which any sensor in the system receives a signal. The problem is formulated as a stochastic optimization problem in which an extended Lorden's criterion is used as a measure of detection delay, with a constraint on the mean time to the first false alarm. The case in which the sensors employ their own cumulative sum (CUSUM) strategies is considered, and it is proved that the minimum of the 2 CUSUMs is asymptotically optimal as the mean time to the first false alarm increases without bound.
Implications of this asymptotic optimality result for the efficiency of the decentralized versus the centralized system of observations are further discussed.
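To make the building block of the minimum-of-CUSUMs rule concrete, here is a single-sensor CUSUM recursion for an upward mean shift in Gaussian noise; the two-sensor, correlated-noise model of the talk is not reproduced, and the shift size and threshold are illustrative:

```python
# A single-sensor CUSUM recursion for detecting an upward mean shift in
# Gaussian noise, shown only to make the building block of the
# "minimum of 2 CUSUMs" rule concrete. The two-sensor correlated-noise
# model of the talk is not reproduced here.
import numpy as np

def cusum_alarm_time(obs, mu1, threshold):
    """First index at which the CUSUM statistic crosses `threshold`, testing a
    mean shift from 0 to mu1 in unit-variance Gaussian noise (None if no alarm)."""
    s = 0.0
    for k, x in enumerate(obs):
        llr = mu1 * x - 0.5 * mu1 ** 2      # log-likelihood ratio increment
        s = max(0.0, s + llr)               # CUSUM: random walk reflected at 0
        if s >= threshold:
            return k
    return None

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    change_point, mu1 = 500, 1.0
    signal = np.where(np.arange(1000) >= change_point, mu1, 0.0)
    alarm = cusum_alarm_time(signal + rng.standard_normal(1000), mu1=mu1, threshold=8.0)
    print(f"change at t = {change_point}, alarm raised at t = {alarm}")
```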
March 26
Speaker: Arian Maleki, Department of Statistics, Columbia University
Title: Phase transitions in compressed sensing
Abstract: Compressed sensing aims to undersample certain high-dimensional signals and yet accurately reconstruct them by exploiting signal structure. Accurate reconstruction is possible when the sparsity of the object to be recovered is below a certain level, known as the phase transition curve. Characterizing the phase transition curve of different reconstruction algorithms is a major problem in compressed sensing. In this talk, I will start with a method we developed, based on the analysis of message passing, to characterize the phase transition curve for a large class of recovery algorithms. I will then describe the situations in which our analysis fails and propose a method to address them. In the end, if time permits, I will also describe several open problems.
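The message-passing analysis itself is not sketched here; the following small experiment with iterative soft-thresholding (ISTA) only exhibits the empirical phenomenon behind a phase transition, namely that recovery of a k-sparse signal from n < N random measurements succeeds for small k and fails once k is too large. All problem sizes and the regularization weight are illustrative:

```python
# Not the talk's message-passing analysis: a small iterative
# soft-thresholding (ISTA) experiment showing that recovery of a k-sparse
# signal from n < N random measurements succeeds for small k and degrades
# badly once k is too large. Sizes and regularization are illustrative.
import numpy as np

def ista(A, y, lam=0.05, n_iters=500):
    """Minimize 0.5*||Ax - y||^2 + lam*||x||_1 by proximal gradient steps."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2                 # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        g = x - step * A.T @ (A @ x - y)                   # gradient step on the quadratic part
        x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)   # soft-thresholding
    return x

def relative_recovery_error(N=400, n=160, k=20, seed=0):
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n, N)) / np.sqrt(n)           # random Gaussian measurement matrix
    x_true = np.zeros(N)
    x_true[rng.choice(N, size=k, replace=False)] = rng.standard_normal(k)
    x_hat = ista(A, A @ x_true)
    return np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)

if __name__ == "__main__":
    for k in (10, 40, 120):                                # sparsity below / near / above the transition
        print(f"k = {k:3d}: relative recovery error = {relative_recovery_error(k=k):.3f}")
```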
April 9
Speaker: Mark Brown
Department of Mathematics, City College, CUNY
Title: Stein Estimators and their Properties
Abstract:
Stein estimation is one of the many seminal contributions to statistics made by Charles Stein. Initially there was controversy as to whether his remarkable results were of practical, rather than simply mathematical, significance. However, as they became better understood and interpreted, it became clear that the method and its underlying philosophy were an important step forward both for data analysis and for statistical theory. Today the ideas are implicit in a wide variety of statistical methodology, even if the actual estimators are not employed in their original form. This talk will go back several decades to review the history, including the important contributions of Efron and Morris, which explicitly related Stein estimation to parametric empirical Bayes estimation, an offshoot of the empirical Bayes approach of Herbert Robbins. My own involvement is as a user of the approach rather than a contributor to the methodology. If time permits, I'll discuss how I have found the ideas useful in two statistical projects I have been involved with.
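For readers who have not seen it, the classical James-Stein estimator referred to above is the following standard statement (added for context, not taken from the abstract):

```latex
% The classical James--Stein estimator (standard statement, added for
% context). For $X \sim N_p(\theta, \sigma^2 I_p)$ with $p \ge 3$, the estimator
\[
  \hat{\theta}_{JS} \;=\; \Bigl( 1 - \frac{(p-2)\,\sigma^2}{\lVert X \rVert^2} \Bigr) X
\]
% has strictly smaller total mean squared error
% $\mathbb{E}\,\lVert \hat{\theta} - \theta \rVert^2$ than the maximum
% likelihood estimator $\hat{\theta} = X$, uniformly in $\theta$.
```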
*Thursday, April 25
Time: 1:00 - 2:00
Location: Room 717 Hamilton
Speaker: Jim Fill
Department of Applied Mathematics and Statistics
The Johns Hopkins University
Title: "Distributional Convergence for the Number of Symbol Comparisons Used by QuickSort"
Abstract:
We will begin by reviewing the operation of the sorting
algorithm QuickSort. Most previous analyses of QuickSort have used the
number of key comparisons as a measure of the cost of executing
the algorithm. In contrast, we suppose that the n independent and
identically distributed (iid) keys are each represented as a
sequence of symbols from a probabilistic source and that QuickSort
operates on individual symbols, and we measure the execution cost
as the number of symbol comparisons. Assuming only a mild
"tameness" condition on the source, we show that there is a
limiting distribution for the number of symbol comparisons after
normalization: first centering by the mean and then dividing by n.
Additionally, under a condition that grows more restrictive as p
increases, we have convergence of moments of orders p and smaller.
In particular, we have convergence in distribution and convergence
of moments of every order whenever the source is memoryless, i.e.,
whenever each key is generated as an infinite string of iid
symbols. This is somewhat surprising: even for the classical model
in which each key is an iid string of unbiased ("fair") bits, the mean
exhibits periodic fluctuations of order n.
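To make the cost measure concrete, here is an instrumented QuickSort on string keys that counts individual symbol (character) comparisons rather than key comparisons; the probabilistic-source model and the limit theorems of the talk are of course not captured by this sketch, and the key length and sample size are arbitrary:

```python
# Instrumented QuickSort on string keys, counting symbol (character)
# comparisons rather than key comparisons -- the cost measure discussed in
# the talk. The probabilistic-source model and the limit theorems are not
# reproduced; this only shows what is being counted.
import random

def compare(a: str, b: str, counter: list) -> int:
    """Lexicographic comparison that counts every individual symbol comparison."""
    for ca, cb in zip(a, b):
        counter[0] += 1
        if ca != cb:
            return -1 if ca < cb else 1
    return (len(a) > len(b)) - (len(a) < len(b))

def quicksort(keys: list, counter: list) -> list:
    if len(keys) <= 1:
        return keys
    pivot = keys[0]
    less, equal, greater = [], [], []
    for key in keys:
        c = compare(key, pivot, counter)
        (less if c < 0 else greater if c > 0 else equal).append(key)
    return quicksort(less, counter) + equal + quicksort(greater, counter)

if __name__ == "__main__":
    random.seed(0)
    # Memoryless source: each key is a string of iid unbiased ("fair") bits.
    keys = ["".join(random.choice("01") for _ in range(40)) for _ in range(10_000)]
    counter = [0]
    quicksort(keys, counter)
    print(f"n = {len(keys)} keys, symbol comparisons = {counter[0]}")
```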