Detailed course information
Title | Winter School 2012 |
CUSO ID | 12220001 |
Dates | 5 to 8 February 2012 |
Organizer(s) | |
Speakers | Professors R.A. Davis (USA), B.P. Carlin (USA) and N.W. Hengartner (USA) |
Description | Professor Richard A. Davis, Department of Statistics, Columbia University

Title: Two Topics in Time Series Modeling

Lectures 1 and 2: Detection of Structural Breaks and Outliers in Time Series

Often, time series data exhibit nonstationarity in which individual segments look stationary but the whole ensemble is nonstationary. In the first two lectures, we consider the problem of modeling a class of non-stationary time series with outliers using piecewise autoregressive (AR) processes. The number and locations of the piecewise autoregressive segments, as well as the orders of the respective AR processes, are assumed to be unknown, and each piece may be contaminated with an unknown number of innovational and/or additive outliers. The minimum description length (MDL) principle is applied to compare various segmented AR fits to the data. The goal is to find the “best” combination of the number of segments, the lengths of the segments, the orders of the piecewise AR processes, and the number and type of outliers. Such a “best” combination is implicitly defined as the optimizer of an MDL criterion. Since the optimization is carried out over a large number of configurations of segments and positions of outliers, a genetic algorithm is used to find optimal or near-optimal solutions. Strategies for accelerating the procedure will also be described. Numerical results from simulation experiments and real data analyses show that the procedure enjoys excellent empirical properties. The theory behind this procedure will also be discussed. (This is joint work with Thomas Lee and Gabriel Rodriguez-Yam.)
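As a rough illustration of the MDL idea above, the sketch below scores candidate piecewise-AR segmentations of a series with one structural break. The helper names (`segment_cost`, `mdl_score`), the exact cost terms, and the brute-force search over a hand-picked candidate list (standing in for the genetic algorithm of the lectures) are assumptions of this illustration, not the authors' implementation; outlier terms are omitted.

```python
# Minimal sketch: MDL-style scoring of piecewise-AR segmentations.
# Illustrative only; not the Davis-Lee-Rodriguez-Yam implementation.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

def segment_cost(x):
    """MDL cost of one segment: pick the AR order minimising the
    parameter cost plus a Gaussian code length for the residuals."""
    n = len(x)
    best = np.inf
    for p in range(1, min(5, n - 2) + 1):
        res = AutoReg(x, lags=p, trend="c").fit()
        sigma2 = np.mean(res.resid ** 2)
        cost = (p + 2) / 2 * np.log2(n) + n / 2 * np.log2(sigma2)
        best = min(best, cost)
    return best

def mdl_score(x, breaks):
    """MDL of a piecewise-AR segmentation with the given break points."""
    bounds = [0] + sorted(breaks) + [len(x)]
    segments = [x[a:b] for a, b in zip(bounds[:-1], bounds[1:])]
    return (np.log2(len(segments))                      # number of segments
            + sum(np.log2(len(s)) for s in segments)    # break locations
            + sum(segment_cost(s) for s in segments))   # per-segment fits

# Toy example: an AR(1) series whose coefficient flips sign halfway.
rng = np.random.default_rng(0)
x = np.empty(400)
x[0] = 0.0
for t in range(1, 400):
    phi = 0.7 if t < 200 else -0.7
    x[t] = phi * x[t - 1] + rng.standard_normal()

candidates = [[], [100], [200], [100, 300]]
best = min(candidates, key=lambda b: mdl_score(x, b))
print("break points minimising MDL:", best)  # typically selects [200]
```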
Lecture 3: Estimating Extremal Dependence in Time Series via the Extremogram

The extremogram is a flexible quantitative tool that measures various types of extremal dependence in a stationary time series. In many respects, the extremogram can be viewed as an extreme-value analogue of the autocorrelation function (ACF) of a time series. Under mixing conditions, the asymptotic normality of the empirical extremogram was derived in Davis and Mikosch (2009). Unfortunately, the limiting variance is a difficult quantity to estimate. Instead, we apply the stationary bootstrap to the empirical extremogram and establish that this resampling procedure provides an asymptotically correct approximation to the central limit theorem. This in turn can be used for constructing credible confidence bounds for the sample extremogram. The use of the stationary bootstrap for the extremogram is illustrated on a variety of real and simulated data sets.

The cross-extremogram measures cross-sectional extremal dependence in multivariate time series. A measure of this dependence, especially of left-tail dependence, is of great importance in the calculation of portfolio risk. We find that after devolatilizing the marginal series, extremal dependence still remains, which suggests that the extremal dependence is not due solely to the heteroskedasticity in the stock returns process. However, for the univariate series, the filtering removes all extremal dependence. Following Geman and Chang (2010), a return time extremogram, which measures the waiting time between rare or extreme events in univariate and bivariate stationary time series, is calculated. The return time extremogram suggests the existence of extremal clustering in the return times of extreme events for financial assets. The stationary bootstrap again provides an asymptotically correct approximation to the central limit theorem and can be used for constructing credible confidence bounds for this return time extremogram.
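For lag h and a high threshold u, the upper-tail sample extremogram is the proportion of exceedances of u at time t that are followed by an exceedance at time t+h. The sketch below computes it and attaches pointwise stationary-bootstrap bounds (Politis-Romano blocks with geometrically distributed lengths); the 95% threshold, the mean block length, and the toy data are assumptions of this illustration, not code from the lectures.

```python
# Minimal sketch: sample extremogram with stationary-bootstrap bounds.
import numpy as np

def extremogram(x, lags, q=0.95):
    """rho(h) = P(X_{t+h} > u | X_t > u), with u the q-quantile of x."""
    u = np.quantile(x, q)
    e = x > u
    n = len(x)
    out = []
    for h in lags:
        joint = np.sum(e[: n - h] & e[h:])
        out.append(joint / max(np.sum(e[: n - h]), 1))
    return np.array(out)

def stationary_bootstrap(x, mean_block=50, rng=None):
    """One Politis-Romano resample: geometric block lengths, wrapped."""
    rng = rng or np.random.default_rng()
    n = len(x)
    out = np.empty(n)
    i = 0
    while i < n:
        start = rng.integers(n)
        length = min(rng.geometric(1 / mean_block), n - i)
        out[i : i + length] = np.take(x, range(start, start + length),
                                      mode="wrap")
        i += length
    return out

# Toy data: i.i.d. heavy-tailed noise, so the true extremogram at
# positive lags is just the marginal tail probability.
rng = np.random.default_rng(1)
x = rng.standard_t(df=3, size=2000)
lags = range(1, 11)
boot = np.array([extremogram(stationary_bootstrap(x, rng=rng), lags)
                 for _ in range(200)])
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
for h, est, a, b in zip(lags, extremogram(x, lags), lo, hi):
    print(f"h={h}: {est:.3f}  [{a:.3f}, {b:.3f}]")
```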
Professor Bradley P. Carlin, biostatistician, University of Minnesota

Title: Bayesian Adaptive Methods for Clinical Trials

Professor Nicolas W. Hengartner, Los Alamos National Laboratory

Title: Statistics in action: statistical modeling in the physical sciences

Overview: Statistical modeling is an integral part of the physical sciences, where it is used to model observations, estimate parameters and help assess the uncertainty of predictions. Developing these statistical models requires an interdisciplinary approach that combines subject-matter knowledge with mathematics and statistics. In this set of lectures, I present three examples of statistical modeling in the physical sciences that represent the wide range of modeling encountered at the Los Alamos National Laboratory.

Lecture 1: Modeling correlated neutrons. A fissile material is capable of sustaining a chain reaction of nuclear fission, a nuclear reaction that splits the nucleus of an atom into smaller parts and releases neutrons and photons. The distribution of the number of neutrons produced by a fission chain reaction characterizes the type and amount of the fissile material, and is therefore useful for radiological assays. This first talk discusses estimating that distribution from the observed point process of detection times of individual neutrons. A description of that point process requires modeling both the process of generating neutrons, using a modified branching process, and the process of detecting them, using He-3 based detectors, which introduces random delays. This makes estimating the distribution of chain sizes challenging. We propose to estimate the chain size distribution using a method of moments that involves the probability generating functional of the point process.

Lecture 2: Modeling cosmic muon scattering for passive imaging of high-Z materials. Soft cosmic rays are sub-atomic particles that collide with the upper atmosphere to produce pions, which decay to muons, electrons and positrons. Most muons reach the earth's crust, where they penetrate into matter and interact through multiple Coulomb scattering. This makes it possible to use them as probes for tomographic imaging. This second talk discusses the modeling and statistical tools required for tomographic imaging. The problem has several interesting features. First, scattering is random, with the signal being in the variance instead of the mean. Second, it is possible to investigate a priori what features of the image can be easily reconstructed. Thus, while this is an inverse problem, we can identify the "well-posed questions in this ill-posed problem".

Lecture 3: Statistical modeling of computer codes. Many scientific phenomena are too complex to be modeled analytically (think of weather prediction, for example) but remain amenable to numerical solution using computer codes. These codes often depend on many input parameters that we wish to estimate from data. In this talk, I will discuss an approach to this problem based on the idea of building an emulator (a surrogate) for the computationally expensive computer code. Issues with the current state of the art will be discussed.
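As a minimal illustration of the emulator idea in Lecture 3, the sketch below fits a Gaussian-process surrogate to a handful of runs of a toy "expensive" simulator, then calibrates the input parameter against one observation. The simulator, kernel choice, and calibration-by-least-squares step are assumptions of this illustration, not Los Alamos code.

```python
# Minimal sketch: a Gaussian-process emulator for an expensive simulator.
import numpy as np
from scipy.optimize import minimize_scalar
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def simulator(theta):
    """Stand-in for a code that takes hours per run."""
    return np.sin(3 * theta) + 0.5 * theta

# Design: a few expensive runs at chosen inputs.
design = np.linspace(0.0, 2.0, 8)[:, None]
runs = simulator(design.ravel())

# Emulator: cheap surrogate fitted to the design runs.
gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=0.5) + WhiteKernel(1e-6),
    normalize_y=True,
).fit(design, runs)

# Calibration: find the theta whose emulated output matches the data.
y_obs = simulator(1.23) + 0.01  # one observation with a little error
res = minimize_scalar(
    lambda t: (gp.predict(np.array([[t]]))[0] - y_obs) ** 2,
    bounds=(0.0, 2.0), method="bounded",
)
print("calibrated theta:", round(res.x, 3))
```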
PROGRAMME:
Location | Les Diablerets |
Information | Eurotel Victoria |
Fees |
CUSO doctoral student, double room: 150 CHF
CUSO doctoral student, single room: 300 CHF
CUSO postdoc, double room: 250 CHF
CUSO postdoc, single room: 400 CHF
CUSO professor, double room: 350 CHF
CUSO professor, single room: 500 CHF
Non-CUSO academic, double room: 600 CHF
Non-CUSO academic, single room: 750 CHF
Non-CUSO private, double room: 1100 CHF
Non-CUSO private, single room: 1250 CHF |
Places | |
Registration deadline | 20.01.2012 |