Detailed course information

Title

Winter School 2012

CUSO ID

12220001

Dates

5 to 8 February 2012

Organizer(s)
Speakers

Professors R.A. Davis (USA), B.P. Carlin (USA) and N.W. Hengartner (USA)

Description
Professor Richard A. Davis, Department of Statistics, Columbia University

Title: Two Topics in Time Series Modeling

Lectures 1 and 2: Detection of Structural Breaks and Outliers in Time Series

Often, time series data exhibit nonstationarity in which segments look stationary, but the whole ensemble is nonstationary. In the first two lectures, we will consider the problem of modeling a class of non-stationary time series with outliers using piecewise autoregressive (AR) processes. The number and locations of the piecewise autoregressive segments, as well as the orders of the respective AR processes, are assumed to be unknown, and each piece may be contaminated with an unknown number of innovational and/or additive outliers. The minimum description length (MDL) principle is applied to compare various segmented AR fits to the data. The goal is to find the “best” combination of the number of segments, the lengths of the segments, the orders of the piecewise AR processes, and the number and type of outliers. Such a “best” combination is implicitly defined as the optimizer of an MDL criterion. Since the optimization is carried out over a large number of configurations of segments and positions of outliers, a genetic algorithm is used to find optimal or near-optimal solutions. Strategies for accelerating the procedure will also be described. Numerical results from simulation experiments and real data analyses show that the procedure enjoys excellent empirical properties. The theory behind this procedure will also be discussed. (This is joint work with Thomas Lee and Gabriel Rodriguez-Yam.)
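
For readers who want a concrete feel for the criterion, the sketch below (Python, not the authors' implementation) scores a candidate segmentation by fitting an AR model to each piece via Yule-Walker and summing schematic code-length penalties for the number of segments, their locations and the AR orders; the penalty constants are simplified and the outlier terms are omitted. A genetic algorithm would then search over such configurations for the minimizer.

import numpy as np

def ar_yule_walker(x, p):
    # Fit AR(p) by Yule-Walker; return the innovation variance estimate.
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    acov = np.array([x[:n - k] @ x[k:] / n for k in range(p + 1)])
    if p == 0:
        return acov[0]
    R = np.array([[acov[abs(i - j)] for j in range(p)] for i in range(p)])
    phi = np.linalg.solve(R, acov[1:])
    return acov[0] - phi @ acov[1:]

def mdl_score(x, breaks, orders):
    # Schematic MDL for a piecewise AR fit: code length for the number of
    # segments, their locations and orders, plus the Gaussian likelihood term.
    n = len(x)
    edges = [0] + list(breaks) + [n]
    m = len(edges) - 1
    score = np.log(m) + m * np.log(n)
    for j in range(m):
        seg = x[edges[j]:edges[j + 1]]
        nj, pj = len(seg), orders[j]
        sigma2 = ar_yule_walker(seg, pj)
        score += np.log(pj + 1) + (pj + 2) / 2 * np.log(nj) + nj / 2 * np.log(sigma2)
    return score

# Toy comparison: an AR(1) series whose coefficient flips sign halfway.
rng = np.random.default_rng(0)
x = np.empty(1000)
x[0] = rng.normal()
for t in range(1, 1000):
    phi = 0.7 if t < 500 else -0.7
    x[t] = phi * x[t - 1] + rng.normal()
print(mdl_score(x, [], [1]), mdl_score(x, [500], [1, 1]))  # the segmented fit should win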

Lecture 3: Estimating Extremal Dependence in Time Series via the Extremogram

The extremogram is a flexible quantitative tool that measures various types of extremal dependence in a stationary time series. In many respects, the extremogram can be viewed as an extreme-value analogue of the autocorrelation function (ACF) for a time series. Under mixing conditions, the asymptotic normality of the empirical extremogram was derived in Davis and Mikosch (2009). Unfortunately, the limiting variance is a difficult quantity to estimate. Instead, we apply the stationary bootstrap to the empirical extremogram and establish that this resampling procedure provides an asymptotically correct approximation to the central limit theorem. This in turn can be used for constructing credible confidence bounds for the sample extremogram. The use of the stationary bootstrap for the extremogram is illustrated in a variety of real and simulated data sets. The cross-extremogram measures cross-sectional extremal dependence in multivariate time series. A measure of this dependence, especially left-tail dependence, is of great importance in the calculation of portfolio risk. We find that after devolatilizing the marginal series, extremal dependence still remains, which suggests that the extremal dependence is not due solely to the heteroskedasticity in the stock returns process. However, for the univariate series, the filtering removes all extremal dependence. Following Geman and Chang (2010), a return time extremogram, which measures the waiting time between rare or extreme events in univariate and bivariate stationary time series, is calculated. The return time extremogram suggests the existence of extremal clustering in the return times of extreme events for financial assets. The stationary bootstrap can again provide an asymptotically correct approximation to the central limit theorem and can be used for constructing credible confidence bounds for this return time extremogram.
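
As a minimal sketch of the two ingredients just described, the following Python fragment computes a sample extremogram at a high-quantile threshold and pointwise stationary-bootstrap bands; the 95% threshold, mean block length, heavy-tailed test series and function names are illustrative assumptions, not choices from the lecture.

import numpy as np

def extremogram(x, lags, q=0.95):
    # Sample extremogram at the upper q-quantile u: estimate of P(X_{t+h} > u | X_t > u).
    u = np.quantile(x, q)
    exc = (np.asarray(x) > u).astype(float)
    n = len(exc)
    return np.array([(exc[:n - h] * exc[h:]).sum() / exc[:n - h].sum() for h in lags])

def stationary_bootstrap(x, mean_block, rng):
    # Politis-Romano stationary bootstrap: geometric block lengths, circular wrap-around.
    n = len(x)
    idx = np.empty(n, dtype=int)
    t = 0
    while t < n:
        start = rng.integers(n)
        length = min(rng.geometric(1.0 / mean_block), n - t)
        idx[t:t + length] = (start + np.arange(length)) % n
        t += length
    return np.asarray(x)[idx]

# Pointwise 95% bootstrap bands for the extremogram of a hypothetical heavy-tailed series.
rng = np.random.default_rng(1)
x = rng.standard_t(df=3, size=2000)          # stand-in for a return series
lags = range(1, 21)
est = extremogram(x, lags)
boot = np.array([extremogram(stationary_bootstrap(x, 50, rng), lags) for _ in range(200)])
lower, upper = np.percentile(boot, [2.5, 97.5], axis=0)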

Professor Bradley P. Carlin, biostatistician at the University of Minnesota

Title:  Bayesian Adaptive Methods for Clinical Trials 
 
Overview:  Thanks in large part to the rapid development of Markov chain Monte Carlo (MCMC) methods and software for their implementation, Bayesian methods have become ubiquitous in modern biostatistical analysis. In submissions to regulatory agencies where data on new drugs or medical devices are often scanty but researchers have access to large historical databases, Bayesian methods have emerged as particularly helpful in combining the disparate sources of information while maintaining traditional frequentist protections regarding Type I error and power.  Biostatisticians in earlier phases (especially Phase I oncology trials) have long appreciated Bayes' ability to get good answers quickly.  Finally, an increasing desire for adaptability in clinical trials (to react to trial knowledge as it accumulates) has also led to heightened interest in Bayesian methods.  This lecture series introduces Bayesian methods, computing, and software, and then goes on to elucidate their use in Phase I and II clinical trials.  We include descriptions of how the methods can be implemented in WinBUGS, R, and BRugs, a version of the BUGS package callable from within R. 
 
Lecture 1:  Introduction to Hierarchical Bayes Methods and Computing 
  Bayesian inference:  point and interval estimation, model choice 
  Bayesian computing:  MCMC methods; Gibbs sampler; Metropolis-Hastings algorithm (a minimal sketch follows after this list) 
  Hierarchical modeling and meta-analysis 
  Principles of Bayesian clinical trial design:  predictive probability, indifference zone, Bayesian and frequentist operating characteristics (power, Type I error) 
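
As a concrete illustration of the MCMC item above, here is a minimal random-walk Metropolis-Hastings sampler in Python; the binomial example, Normal(0, 2^2) prior on the log-odds and step size are hypothetical choices for illustration only.

import numpy as np

def metropolis(log_post, theta0, n_iter=5000, step=0.5, seed=0):
    # Random-walk Metropolis-Hastings: propose theta' = theta + step * N(0, 1),
    # accept with probability min(1, posterior ratio).
    rng = np.random.default_rng(seed)
    theta, lp = float(theta0), log_post(theta0)
    draws = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + step * rng.normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        draws[i] = theta
    return draws

# Hypothetical example: posterior of a log-odds theta with a Normal(0, 2^2)
# prior and a binomial likelihood (12 responses in 30 patients).
y, n = 12, 30
def log_post(theta):
    p = 1.0 / (1.0 + np.exp(-theta))
    return y * np.log(p) + (n - y) * np.log(1.0 - p) - theta**2 / 8.0

draws = metropolis(log_post, 0.0)
print(draws[1000:].mean(), np.quantile(draws[1000:], [0.025, 0.975]))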
 
Lecture 2:  Bayesian design and analysis for Phase I studies 
  Rule-based designs for determining the MTD (e.g., 3+3) 
  Model-based designs for determining the MTD (CRM, EWOC, TITE monitoring, toxicity intervals); an illustrative CRM sketch follows after this list 
  Dose ranging and optimal biologic dosing 
  Efficacy and toxicity 
  Examples and software 
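
To give a flavor of the model-based designs listed above, the sketch below implements a bare-bones one-parameter power-model CRM on a grid: it updates the posterior toxicity curve after each cohort and recommends the dose whose estimated toxicity is closest to a target rate. The skeleton, prior standard deviation, target and data are illustrative assumptions, not values from the lectures.

import numpy as np

def crm_next_dose(dose_idx, tox, skeleton, target=0.25, prior_sd=1.34):
    # One-parameter power-model CRM: P(toxicity at dose i) = skeleton[i] ** exp(a),
    # with a normal prior on a; the posterior is computed by grid integration.
    a = np.linspace(-4.0, 4.0, 801)
    log_post = -a**2 / (2 * prior_sd**2)
    for d, y in zip(dose_idx, tox):                 # observed (dose index, toxicity) pairs
        p = skeleton[d] ** np.exp(a)
        log_post += y * np.log(p) + (1 - y) * np.log(1 - p)
    post = np.exp(log_post - log_post.max())
    post /= post.sum()                              # discrete posterior weights on the grid
    p_hat = np.array([np.sum(s ** np.exp(a) * post) for s in skeleton])
    return int(np.argmin(np.abs(p_hat - target))), p_hat

# Hypothetical data: first three patients treated at dose index 1, one toxicity seen.
skeleton = [0.05, 0.12, 0.25, 0.40, 0.55]
next_dose, p_hat = crm_next_dose([1, 1, 1], [0, 0, 1], skeleton)
print(next_dose, np.round(p_hat, 2))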
 
Lecture 3:  Bayesian design and analysis for Phase II studies 
  Standard designs:  Phase IIA (single-arm) vs. Phase IIB (multi-arm) 
  Predictive probability-based methods (a minimal sketch follows after this list) 
  Sequential stopping:  for futility, efficacy 
  Multi-arm designs with adaptive randomization 
  Adaptive confirmatory trials:  adaptive sample size, futility analysis, arm dropping 
  Bayesian hierarchical methods in safety studies 
  Adaptive incorporation of historical data 
  Summary and Floor Discussion 
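
To make the predictive-probability idea above concrete, here is a minimal single-arm (Phase IIA) calculation under a beta-binomial model: at an interim look, it computes the probability that the completed trial's posterior will declare the response rate above a null value. The prior, null rate, posterior threshold and interim counts are hypothetical.

import numpy as np
from scipy.stats import beta, betabinom

def predictive_prob_success(y, n, n_max, p0=0.2, eta=0.95, a=1.0, b=1.0):
    # Predictive probability of trial success at an interim look:
    # y responses observed in n patients; n_max planned in total.
    # Success = final posterior Pr(response rate > p0) >= eta, with a Beta(a, b) prior.
    m = n_max - n                                        # patients still to enroll
    prob = 0.0
    for x in range(m + 1):                               # possible future responses
        w = betabinom.pmf(x, m, a + y, b + n - y)        # posterior predictive weight
        post_tail = 1.0 - beta.cdf(p0, a + y + x, b + n_max - y - x)
        prob += w * (post_tail >= eta)
    return prob

# Hypothetical interim look: 7 responses in 20 patients, 40 planned in total.
print(predictive_prob_success(y=7, n=20, n_max=40))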

 

Professor Nicolas W. Hengartner, Los Alamos National Laboratory

Title: Statistics in action: statistical modeling in the physical sciences

Overview: Statistical modeling is an integral part of the physical sciences, where it is used to model observations, estimate parameters and help assess the uncertainty of predictions. Developing these statistical models requires an interdisciplinary approach that combines subject-matter knowledge with mathematics and statistics. In this set of lectures, I present three examples of statistical modeling in the physical sciences that represent the wide range of modeling encountered at the Los Alamos National Laboratory.

Lecture 1: Modeling correlated neutrons. A fissile material is capable of sustaining a chain reaction of nuclear fission, a nuclear reaction that splits the nucleus of an atom into smaller parts and releases neutrons and photons. The distribution of the number of neutrons produced by a fission chain reaction characterizes the type and amount of the fissile material, and is therefore useful for radiological assays. This first talk discusses estimating that distribution from the observed point process of detection times of individual neutrons. A description of that point process requires modeling both the process of generating neutrons, using a modified branching process, and the process of detecting the neutrons, using He-3-based detectors, which introduces random delays. This makes estimating the distribution of chain sizes challenging. We propose to estimate the chain size distribution using a method of moments that involves the probability generating functional of the point process.
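
As a toy illustration of the branching-process viewpoint (deliberately ignoring detector delays and dead time, and using made-up parameters), the following Python snippet simulates fission-chain sizes with a plain Galton-Watson model and reports the low-order moments a method-of-moments fit would work with.

import numpy as np

def chain_size(p_fission, nu_pmf, rng, max_gen=200):
    # Total number of neutrons in one chain of a simple Galton-Watson model:
    # each neutron induces a fission with probability p_fission, and a fission
    # emits nu fresh neutrons drawn from the multiplicity distribution nu_pmf.
    total, active = 1, 1
    for _ in range(max_gen):
        if active == 0:
            break
        fissions = rng.binomial(active, p_fission)
        active = int(rng.choice(len(nu_pmf), size=fissions, p=nu_pmf).sum())
        total += active
    return total

rng = np.random.default_rng(1)
nu_pmf = np.array([0.05, 0.2, 0.35, 0.3, 0.1])       # made-up P(nu = 0, ..., 4)
sizes = np.array([chain_size(0.3, nu_pmf, rng) for _ in range(10000)])

# Low-order factorial moments of the chain-size distribution: the kind of
# summary a method-of-moments fit would match against detector statistics.
print(sizes.mean(), (sizes * (sizes - 1)).mean())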

Lecture 2: Modeling cosmic muon scattering for passive imaging of high-Z materials. Soft cosmic rays are sub-atomic particles that collide with the upper atmosphere to produce pions, which decay to muons, electrons and positrons. Most muons reach the Earth's surface, where they penetrate into matter and interact through multiple Coulomb scattering. This makes it possible to use them as probes for tomographic imaging. This second talk discusses the modeling and statistical tools required for tomographic imaging. This problem has several interesting features. First, scattering is random, with the signal being in the variance instead of the mean. Second, it is possible to investigate a priori what features of the image can be easily reconstructed. Thus, while this is an inverse problem, we can identify the "well-posed questions in this ill-posed problem".

Lecture 3: Statistical modeling of computer codes. Many scientific phenomena are too complex to be modeled analytically (think of weather prediction, for example) but remain amenable to numerical solution using computer codes. These codes often depend on many input parameters that we wish to estimate from data. In this talk, I will discuss an approach to this problem based on the idea of building an emulator (a surrogate) for the computationally expensive computer code. Issues with the current state of the art will be discussed.
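
As a small illustration of the emulator idea, the sketch below fits a Gaussian-process surrogate (scikit-learn) to a handful of runs of a stand-in "simulator"; the toy function, design points and kernel are assumptions chosen for illustration only.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def simulator(theta):
    # Stand-in for an expensive physics code with a single input parameter.
    return np.sin(3.0 * theta) + 0.5 * theta

# Run the code at a small design of inputs, then fit a Gaussian-process emulator.
design = np.linspace(0.0, 2.0, 8).reshape(-1, 1)
outputs = simulator(design).ravel()
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.5),
                              normalize_y=True)
gp.fit(design, outputs)

# The emulator predicts the code output (with uncertainty) at untried inputs,
# so that calibration against data need not re-run the expensive code.
grid = np.linspace(0.0, 2.0, 200).reshape(-1, 1)
mean, sd = gp.predict(grid, return_std=True)
print(mean[:3], sd[:3])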

 

PROGRAMME:

Sunday 5.2
  from 16h00: welcome tea
  17h00 - 18h30: Bradley Carlin
  18h30: aperitif
  19h15 - 20h+: dinner

Monday 6.2
  08h30 - 10h00: Richard Davis
  coffee break
  10h30 - 12h00: Nicolas Hengartner
  lunch break
  16h30: coffee break
  17h00 - 18h30: Bradley Carlin
  PhD student presentation (?)
  19h15 - 20h+: dinner

Tuesday 7.2
  08h30 - 10h00: Richard Davis
  coffee break
  10h30 - 12h00: Nicolas Hengartner
  lunch break
  16h30: coffee break
  17h00 - 18h30: Bradley Carlin
  Meeting of the scientific committee
  19h15 - 20h+: dinner

Wednesday 8.2
  08h30 - 10h00: Richard Davis
  coffee break
  10h30 - 12h00: Nicolas Hengartner
  departure

Location

Les Diablerets

Information
 Eurotel Victoria
 1865 Les Diablerets (VD)

Getting to Les Diablerets

BY CAR
Motorway A9, direction Grand St-Bernard, exit Aigle; then the Aigle - Les Diablerets - Col du Pillon road (20 km).

BY PLANE
International airports:
- Genève (120 km)
- Zürich (250 km)
- Bâle (200 km)

BY TRAIN (RAILWAY TIMETABLE)
International: TGV Paris - Lausanne. In winter, the "TGV des Neiges" Paris - Lausanne - Aigle.
Switzerland: Train schedule: From: Geneve airport, To: Les Diablerets, gare.
Direct trains to Aigle, then the A.S.D mountain railway (Aigle - Sépey - Diablerets).
Journey times: Lausanne - Aigle (30 minutes), Aigle - Les Diablerets (50 minutes).

Visa for Switzerland
Weather in Switzerland

Fees

CUSO doctoral student, double room: 150 CHF
CUSO doctoral student, single room: 300 CHF
CUSO postdoctoral researcher, double room: 250 CHF
CUSO postdoctoral researcher, single room: 400 CHF
CUSO professor, double room: 350 CHF
CUSO professor, single room: 500 CHF
Non-CUSO academic, double room: 600 CHF
Non-CUSO academic, single room: 750 CHF
Non-CUSO private sector, double room: 1100 CHF
Non-CUSO private sector, single room: 1250 CHF

Places
Registration deadline: 20.01.2012