967 results for Railroad safety, Bayesian methods, Accident modification factor, Countermeasure selection


Relevance:

100.00%

Publisher:

Abstract:

We analyze crash data collected by the Iowa Department of Transportation using Bayesian methods. The data set includes monthly crash numbers, estimated monthly traffic volumes, site length and other information collected at 30 paired sites in Iowa over more than 20 years, during which an intervention experiment was set up. The intervention consisted of converting 15 undivided road segments from four lanes to three, while an additional 15 segments, thought to be comparable in terms of traffic safety-related characteristics, were not converted. The main objective of this work is to find out whether the intervention reduces the number of crashes and the crash rates at the treated sites. We fitted a hierarchical Poisson regression model with a change-point to the number of monthly crashes per mile at each of the sites. Explanatory variables in the model included estimated monthly traffic volume, time, an indicator for intervention reflecting whether the site was a “treatment” or a “control” site, and various interactions. We accounted for seasonal effects in the number of crashes at a site by including smooth trigonometric functions with three different periods to reflect the four seasons of the year. A change-point at the month and year in which the intervention was completed for treated sites was also included. The number of crashes at a site can be thought to follow a Poisson distribution. To estimate the association between crashes and the explanatory variables, we used a log link function and added a random effect to account for overdispersion and for autocorrelation among observations obtained at the same site. We used proper but non-informative priors for all parameters in the model, and carried out all calculations using Markov chain Monte Carlo methods implemented in WinBUGS. We evaluated the effect of the four-to-three-lane conversion by comparing the expected number of crashes per year per mile during the years preceding and following the conversion for treatment and control sites. We estimated this difference using the observed traffic volumes at each site and also on a per-100,000,000-vehicles basis. We also conducted a prospective analysis to forecast the expected number of crashes per mile at each site in the study one year, three years and five years following the four-to-three-lane conversion. Posterior predictive distributions of the number of crashes, the crash rate and the percent reduction in crashes per mile were obtained for each site for the months of January and June one, three and five years after completion of the intervention. The model appears to fit the data well. We found that in most sites, the intervention was effective and reduced the number of crashes. Overall, and for the observed traffic volumes, the reduction in the expected number of crashes per year and mile at converted sites was 32.3% (31.4% to 33.5% with 95% probability), while at the control sites the reduction was estimated to be 7.1% (5.7% to 8.2% with 95% probability). When the reduction in the expected number of crashes per year, mile and 100,000,000 AADT was computed, the estimates were 44.3% (43.9% to 44.6%) and 25.5% (24.6% to 26.0%) for converted and control sites, respectively. In both cases, the percent reduction in the expected number of crashes during the years following the conversion was significantly larger at converted sites than at control sites, even though the number of crashes appears to decline over time at all sites.
Results indicate that the expected number of crashes per mile declines with a steeper negative slope at converted than at control sites. Consistent with this, the forecasted reduction in the number of crashes per year and mile during the years after completion of the conversion is more pronounced at converted sites than at control sites. Seasonal effects on the number of crashes have been well documented. In this dataset we found that, as expected, the expected number of monthly crashes per mile tends to be higher during winter months than during the rest of the year. Perhaps more interestingly, we found an interaction between the four-to-three-lane conversion and season: the reduction in the number of crashes appears to be more pronounced during months when the weather is nice than during other times of the year, even though a reduction was estimated for the entire year. Thus, it appears that the four-to-three-lane conversion, while effective year-round, is particularly effective in reducing the expected number of crashes in nice weather.
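
As a rough sketch of the model described above (a Poisson change-point regression with a log link, trigonometric seasonal terms, a treatment-by-post-conversion interaction, and a site-level random effect), the following PyMC code fits a simplified version to synthetic data. The authors worked in WinBUGS, with three seasonal periods and an autocorrelation term that this toy version omits; all data and variable names here are invented.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
n_sites, n_months = 30, 240
treated = np.repeat([1.0, 0.0], 15)              # 15 converted, 15 control sites
t = np.arange(n_months)
change = 120                                     # month the conversion was completed
log_vol = rng.normal(8.0, 0.3, size=(n_sites, n_months))  # log monthly traffic volume
y = rng.poisson(2.0, size=(n_sites, n_months))   # placeholder monthly crash counts

post = (t >= change).astype(float)               # change-point indicator

with pm.Model():
    b0 = pm.Normal("b0", 0.0, 10.0)
    b_vol = pm.Normal("b_vol", 0.0, 10.0)
    b_time = pm.Normal("b_time", 0.0, 0.1)       # tight prior: t runs to 240
    b_int = pm.Normal("b_int", 0.0, 10.0)        # treatment x post-conversion effect
    a1 = pm.Normal("a1", 0.0, 10.0)              # one annual harmonic here; the
    a2 = pm.Normal("a2", 0.0, 10.0)              # paper uses three different periods
    season = a1 * np.cos(2 * np.pi * t / 12) + a2 * np.sin(2 * np.pi * t / 12)
    tau = pm.HalfNormal("tau", 1.0)
    u = pm.Normal("u", 0.0, tau, shape=n_sites)  # site effect for overdispersion

    log_mu = (b0 + b_vol * log_vol + b_time * t + season
              + b_int * treated[:, None] * post[None, :] + u[:, None])
    pm.Poisson("crashes", mu=pm.math.exp(log_mu), observed=y)
    idata = pm.sample(1000, tune=1000, chains=2)
```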

Relevance:

100.00%

Publisher:

Abstract:

We provide methods for forecasting variables and predicting turning points in panel Bayesian VARs. We specify a flexible model which accounts for both interdependencies in the cross section and time variations in the parameters. Posterior distributions for the parameters are obtained for a particular type of diffuse prior, for Minnesota-type priors, and for hierarchical priors. Formulas for multistep, multiunit point and average forecasts are provided. An application to the problem of forecasting the growth rate of output and of predicting turning points in the G-7 illustrates the approach. A comparison with alternative forecasting methods is also provided.
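
To make the Minnesota-type prior concrete, here is a minimal single-equation sketch with a known noise variance: the prior shrinks higher lags and other variables' lags toward zero, and the conjugate posterior mean follows in closed form. The paper's panel setting adds cross-unit interdependencies and time-varying parameters that this toy version omits; all data and hyperparameter values are invented.

```python
import numpy as np

def minnesota_prior_var(n_vars, p, lam=0.2, cross=0.5, own_eq=0):
    """Prior variances for the lag coefficients of equation `own_eq`."""
    sd = np.zeros(n_vars * p)
    for lag in range(1, p + 1):
        for j in range(n_vars):
            tight = lam / lag            # shrink higher lags harder
            if j != own_eq:
                tight *= cross           # shrink other variables' lags more
            sd[(lag - 1) * n_vars + j] = tight
    return sd**2

rng = np.random.default_rng(0)
T, n_vars, p = 120, 3, 2
Y = rng.normal(size=(T, n_vars))                           # placeholder data
X = np.hstack([Y[p - k - 1:T - k - 1] for k in range(p)])  # lagged regressors
y = Y[p:, 0]                                               # equation for variable 0

prior_var = minnesota_prior_var(n_vars, p)
b0 = np.zeros(n_vars * p)
b0[0] = 1.0                                  # random-walk prior mean on own lag 1
Om_inv = np.diag(1.0 / prior_var)            # prior precision
# Conjugate normal posterior mean (noise variance normalized to 1 for brevity)
B = np.linalg.solve(X.T @ X + Om_inv, X.T @ y + Om_inv @ b0)
```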

Relevance:

100.00%

Publisher:

Abstract:

The interest in solar ultraviolet (UV) radiation from the scientific community and the general population has risen significantly in recent years because of the link between increased UV levels at the Earth's surface and depletion of ozone in the stratosphere. As a consequence of recent research, UV radiation climatologies have been developed, and the effects of some atmospheric constituents (such as ozone or aerosols) have been studied broadly. Correspondingly, there are well-established relationships between, for example, total ozone column and UV radiation levels at the Earth's surface. Effects of clouds, however, are not so well described, given the intrinsic difficulties in properly describing cloud characteristics. Nevertheless, the effect of clouds cannot be neglected, and the variability that clouds induce on UV radiation is particularly significant when short timescales are involved. In this review we show, summarize, and compare several works that deal with the effect of clouds on UV radiation. Specifically, the works reviewed here approach the issue from the empirical point of view: some relationship between measured UV radiation in cloudy conditions and cloud-related information is given in each work. Basically, there are two groups of methods: techniques that are based on observations of cloudiness (either from human observers or by using devices such as sky cameras) and techniques that use measurements of broadband solar radiation as a surrogate for cloud observations. Some techniques combine both types of information. Comparison of results from different works is addressed by using the cloud modification factor (CMF), defined as the ratio between measured UV radiation in a cloudy sky and calculated radiation for a cloudless sky. Typical CMF values for overcast skies range from 0.3 to 0.7, depending on both cloud type and cloud characteristics. Despite this large dispersion of values corresponding to the same cloud cover, it is clear that the cloud effect on UV radiation is 15–45% lower than the cloud effect on total solar radiation. The cloud effect is usually a reducing effect, but a significant number of works report an enhancement effect (that is, increased UV radiation levels at the surface) due to the presence of clouds. The review concludes with some recommendations for future studies aimed at further analyzing the cloud effects on UV radiation.
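
In code, the CMF defined above is just a ratio; the short sketch below (with made-up readings) shows the computation and how a value would be read against the 0.3 to 0.7 overcast range reported in the review.

```python
def cloud_modification_factor(uv_measured, uv_clear_sky_model):
    """CMF < 1: clouds attenuate UV; CMF > 1: enhancement (e.g. broken clouds)."""
    return uv_measured / uv_clear_sky_model

# Example: an overcast reading of 0.12 W/m2 against a modeled clear-sky 0.30 W/m2
print(cloud_modification_factor(0.12, 0.30))  # 0.4, within the reported 0.3-0.7 range
```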

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: To assess the prevalence of cardiovascular (CV) risk factors in Seychelles, a middle-income African country, and compare the cost-effectiveness of single-risk-factor management (treating individuals with arterial blood pressure ≥ 140/90 mmHg and/or total serum cholesterol ≥ 6.2 mmol/l) with that of management based on total CV risk (treating individuals with a total CV risk ≥ 10% or ≥ 20%). METHODS: CV risk factor prevalence and a CV risk prediction chart for Africa were used to estimate the 10-year risk of suffering a fatal or non-fatal CV event among individuals aged 40-64 years. These figures were used to compare single-risk-factor management with total risk management in terms of the number of people requiring treatment to avert one CV event and the number of events potentially averted over 10 years. Treatment for patients with high total CV risk (≥ 20%) was assumed to consist of a fixed-dose combination of several drugs (polypill). Cost analyses were limited to medication. FINDINGS: A total CV risk of ≥ 10% and ≥ 20% was found among 10.8% and 5.1% of individuals, respectively. With single-risk-factor management, 60% of adults would need to be treated and 157 cardiovascular events per 100 000 population would be averted per year, as opposed to 5% of adults and 92 events with total CV risk management. Management based on high total CV risk optimizes the balance between the number requiring treatment and the number of CV events averted. CONCLUSION: Total CV risk management is much more cost-effective than single-risk-factor management. These findings are relevant for all countries, but especially for those economically and demographically similar to Seychelles.
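
The trade-off reported in the findings can be reproduced with simple arithmetic. The sketch below uses only the percentages and event counts quoted above, treating them as shares of the same reference population of 100 000, to compare how many people each strategy treats per CV event averted per year.

```python
pop = 100_000  # reference population

for name, frac_treated, events_averted in [
    ("single-risk-factor", 0.60, 157),   # 60% treated, 157 events averted/year
    ("total-CV-risk",      0.05,  92),   # 5% treated, 92 events averted/year
]:
    treated = frac_treated * pop
    print(f"{name}: {treated:,.0f} treated, "
          f"{treated / events_averted:,.0f} treated per event averted per year")
# single-risk-factor: ~382 treated per event averted; total-CV-risk: ~54
```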

Relevance:

100.00%

Publisher:

Abstract:

Research was undertaken, sponsored by the Iowa Department of Transportation, to identify specific locations where rumble strips could be expected to improve highway safety. The objective of the research was to recommend warrants for their use on rural highways. An inventory of rumble strip installations on the rural highway systems in the state was conducted in 1981. A total of 685 installations were reported on secondary roads and 147 on primary highways. Over 97 percent of these were in advance of stop signs at intersections. Most of the other installations were in advance of railroad grade crossings. The accident experience with and without rumble strips was compared in two ways. A before-and-after comparison was made for the same location if accident records were available for at least one full year both preceding and following the installation of rumble strips. Accident records for this purpose were available from a statewide computerized record system covering the period from 1977 through 1980. In addition, the accident experience at locations having rumble strips installed before 1978 was compared with a sample of comparable locations not having rumble strips.
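
As an illustration of the simplest form of the before-and-after comparison described, a minimal sketch with invented counts: the basic summary is a crash rate ratio across the periods preceding and following installation. The actual study also compared against similar locations without rumble strips, which this sketch omits.

```python
def rate_ratio(crashes_before, years_before, crashes_after, years_after):
    """Crashes per year after installation relative to before (< 1 suggests an
    improvement, ignoring regression to the mean and traffic volume changes)."""
    return (crashes_after / years_after) / (crashes_before / years_before)

# Illustrative counts for one location, not from the Iowa inventory
print(rate_ratio(crashes_before=12, years_before=2,
                 crashes_after=7, years_after=2))   # ~0.58
```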

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: This study aimed to survey current practices in European epilepsy monitoring units (EMUs) with emphasis on safety issues. METHODS: A 37-item questionnaire investigating characteristics and organization of EMUs, including measures for prevention and management of seizure-related serious adverse events (SAEs), was distributed to all identified European EMUs plus one located in Israel (N=150). RESULTS: Forty-eight (32%) EMUs, located in 18 countries, completed the questionnaire. Epilepsy monitoring unit beds are 1-2 in 43%, 3-4 in 34%, and 5-6 in 19% of EMUs; staff physicians are 1-2 in 32%, 3-4 in 34%, and 5-6 in 19% of EMUs. Personnel operating in EMUs include epileptologists (in 69% of EMUs), clinical neurophysiologists trained in epilepsy (in 46% of EMUs), child neurologists (in 35% of EMUs), neurology and clinical neurophysiology residents (in 46% and in 8% of EMUs, respectively), and neurologists not trained in epilepsy (in 27% of EMUs). In 20% of EMUs, patients' observation is only intermittent or during the daytime and primarily carried out by neurophysiology technicians and/or nurses (in 71% of EMUs) or by patients' relatives (in 40% of EMUs). Automatic detection systems for seizures are used in 15%, for body movements in 8%, for oxygen desaturation in 33%, and for ECG abnormalities in 17% of EMUs. Protocols for management of acute seizures are lacking in 27%, of status epilepticus in 21%, and of postictal psychoses in 87% of EMUs. Injury prevention consists of bed protections in 96% of EMUs, whereas antisuffocation pillows are employed in 21%, and environmental protections in monitoring rooms and in bathrooms are implemented in 38% and in 25% of EMUs, respectively. The most common SAEs were status epilepticus reported by 79%, injuries by 73%, and postictal psychoses by 67% of EMUs. CONCLUSIONS: All EMUs have faced different types of SAEs. Wide variation in practice patterns and lack of protocols and of precautions to ensure patients' safety might promote the occurrence and severity of SAEs. Our findings highlight the need for standardized and shared protocols for an effective and safe management of patients in EMUs.

Relevance:

100.00%

Publisher:

Abstract:

Legislation regulates companies' occupational safety activities quite strictly. For a successful company, however, merely meeting the minimum level required by legislation is no longer enough. Occupational safety has become a significant success factor in the competition for limited resources, such as skilled labour and financiers. This Master's thesis examines ways of improving occupational safety in an industrial company. Several regulatory and expert organizations operate in the field, whose duties include supervising compliance with the legislation. In addition to supervision, they also provide guidance on occupational safety matters. In developing occupational safety, management's attitude and actions in safety matters are essential. Safety should be managed in the same way as the company's other operations. Safety management methods include risk assessment, accident investigation, and the organization of occupational safety training. Creating a good safety culture in the company is also among management's challenges. The state of safety activities in the target company was assessed with the Laatua turvallisuustoimintaan (LATU) method developed at Lappeenranta University of Technology. Based on the results, accident investigation, physical workload, and internal traffic were selected for closer examination. Information on these was collected through interviews and discussions with workers, supervisors, and middle management, as well as through first-hand observation of operations.

Relevance:

100.00%

Publisher:

Abstract:

The variability observed in drug exposure has a direct impact on the overall response to a drug. The largest part of the variability between dose and drug response resides in the pharmacokinetic phase, i.e. in the dose-concentration relationship. Among the possibilities offered to clinicians, Therapeutic Drug Monitoring (TDM; the monitoring of drug concentration measurements) is one of the useful tools to guide pharmacotherapy. TDM aims at optimizing treatments by individualizing dosage regimens based on blood drug concentration measurements. Bayesian calculations, relying on the population pharmacokinetic approach, currently represent the gold-standard TDM strategy. However, this requires expertise and computational assistance, which limits its large-scale implementation in routine patient care. The overall objective of this thesis was to implement robust tools to provide Bayesian TDM to clinicians in modern routine patient care. To that end, the aims were (i) to elaborate an efficient and ergonomic computer tool for Bayesian TDM, EzeCHieL; (ii) to provide algorithms for Bayesian forecasting of drug concentrations and for software validation, relying on population pharmacokinetics; and (iii) to address some relevant issues encountered in clinical practice, with a focus on neonates and drug adherence. First, the existing software was reviewed, which allowed specifications to be established for the development of EzeCHieL. Then, in close collaboration with software engineers, a fully integrated software tool, EzeCHieL, was developed. EzeCHieL provides population-based predictions, Bayesian forecasting and an easy-to-use interface. It makes it possible to assess how expected an observed concentration is in a patient compared to the whole population (via percentiles), to assess the suitability of the predicted concentration relative to the targeted concentration, and to provide dosing adjustment. It thus allows both a priori and a posteriori Bayesian individualization of drug dosing. Implementation of Bayesian methods requires characterising drug disposition and quantifying variability through the population approach. Population pharmacokinetic analyses were performed and Bayesian estimators were provided for candidate drugs in populations of interest: anti-infectious drugs administered to neonates (gentamicin and imipenem). The developed models were implemented in EzeCHieL and also served as a validation tool, with EzeCHieL concentration predictions compared against predictions from the reference software (NONMEM®). The models used need to be adequate and reliable; for instance, extrapolation from adults or children to neonates is not possible. Therefore, this work proposes models for neonates based on the concept of developmental pharmacokinetics. Patient adherence is also an important concern for drug model development and for a successful outcome of pharmacotherapy. A final study assesses the impact of routine patient adherence measurement on model definition and TDM interpretation. In conclusion, our results offer solutions to assist clinicians in interpreting blood drug concentrations and to improve the appropriateness of drug dosing in routine clinical practice.
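
As a sketch of the Bayesian forecasting step that such software automates, the toy example below computes a maximum a posteriori (MAP) estimate of an individual patient's clearance for a one-compartment model at steady state, given a population prior and a single measured concentration. The model, parameter values and names are invented for illustration; they are not EzeCHieL's actual models or interface.

```python
import numpy as np
from scipy.optimize import minimize_scalar

dose, tau, V = 100.0, 12.0, 20.0        # mg, dosing interval (h), volume (L), fixed
t_obs, c_obs = 6.0, 2.1                 # sampling time (h) and measured level (mg/L)
cl_pop, omega = 4.0, 0.3                # population clearance (L/h), log-scale SD
sigma = 0.2                             # residual error SD (mg/L)

def conc(cl, t):
    """Steady-state concentration, one-compartment IV bolus every tau hours."""
    ke = cl / V
    return (dose / V) * np.exp(-ke * t) / (1 - np.exp(-ke * tau))

def neg_log_post(log_cl):
    prior = 0.5 * ((log_cl - np.log(cl_pop)) / omega) ** 2   # log-normal prior
    lik = 0.5 * ((c_obs - conc(np.exp(log_cl), t_obs)) / sigma) ** 2
    return prior + lik

res = minimize_scalar(neg_log_post, bounds=(np.log(0.5), np.log(20)), method="bounded")
cl_map = np.exp(res.x)
print(f"MAP clearance: {cl_map:.2f} L/h; "
      f"predicted trough: {conc(cl_map, tau):.2f} mg/L")
```

The predicted trough from the individualized clearance is what would then be compared against the target concentration to propose a dose adjustment.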

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND & AIMS: Knockout studies of the murine Nuclear Factor I-C (NFI-C) transcription factor revealed abnormal skin wound healing and growth of its appendages, suggesting a role in controlling cell proliferation in adult regenerative processes. Liver regeneration following partial hepatectomy (PH) is a well-established regenerative model whereby changes elicited in hepatocytes lead to their rapid and phased proliferation. Although NFI-C is highly expressed in the liver, no hepatic function was yet established for this transcription factor. This study aimed to determine whether NFI-C may play a role in hepatocyte proliferation and liver regeneration. METHODS: Liver regeneration and cell proliferation pathways following two-thirds PH were investigated in NFI-C knockout (ko) and wild-type (wt) mice. RESULTS: We show that the absence of NFI-C impaired hepatocyte proliferation because of plasminogen activator I (PAI-1) overexpression and the subsequent suppression of urokinase plasminogen activator (uPA) activity and hepatocyte growth factor (HGF) signalling, a potent hepatocyte mitogen. This indicated that NFI-C first acts to promote hepatocyte proliferation at the onset of liver regeneration in wt mice. The subsequent transient down regulation of NFI-C, as can be explained by a self-regulatory feedback loop with transforming growth factor beta 1 (TGF-ß1), may limit the number of hepatocytes entering the first wave of cell division and/or prevent late initiations of mitosis. CONCLUSION: NFI-C acts as a regulator of the phased hepatocyte proliferation during liver regeneration.

Relevance:

100.00%

Publisher:

Abstract:

Fatal and permanently disabling accidents form only one per cent of all occupational accidents, but in many branches of industry they account for more than half the accident costs. Furthermore, the human suffering of the victim and his family is greater in severe accidents than in slight ones. For both human and economic reasons, severe accident risks should be identified before injuries occur. It is for this purpose that different safety analysis methods have been developed. This study shows two new possible approaches to the problem. The first is the hypothesis that it is possible to estimate the potential severity of accidents independently of the actual severity. The second is the hypothesis that when workers are also asked to report near accidents, they are particularly prone to report potentially severe near accidents on the basis of their own subjective risk assessment. A field study was carried out in a steel factory. The results supported both hypotheses. The reliability and the validity of post-incident estimates of an accident's potential severity were reasonable. About 10% of accidents were estimated to be potentially critical; they could have led to death or very severe permanent disability. Reported near accidents were significantly more severe: about 60% of them were estimated to be critical. Furthermore, the validity of workers' subjective risk assessment, manifested in the near-accident reports, proved to be reasonable. The new methods studied require further development and testing. They could be used both routinely in workplaces and in research for identifying accident risks and setting priorities among them.

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this research is to draw up a clear construction of an anticipatory communicative decision-making process and a successful implementation of a Bayesian application that can be used as an anticipatory communicative decision-making support system. This study is a decision-oriented and constructive research project, and it includes examples of simulated situations. As a basis for further methodological discussion about different approaches to management research, a decision-oriented approach is used here; it is based on mathematics and logic and is intended to develop problem-solving methods. The approach is theoretical and characteristic of normative management science research. The approach of this study is also constructive; an essential part of the constructive approach is to tie the problem to its solution with theoretical knowledge. Firstly, the basic definitions and behaviours of anticipatory management and managerial communication are provided. These descriptions include discussions of the research environment and the management processes formed, which define and explain the background to the further research. Secondly, the discussion proceeds to managerial communication and anticipatory decision-making based on preparation, problem solving and solution search, which are also related to risk management analysis. After that, a solution for the decision-making support application is formed using four different Bayesian methods: the Bayesian network, the influence diagram, the qualitative probabilistic network, and the time-critical dynamic network. The purpose of the discussion is not to compare different theories but to explain the theories that are being implemented. Finally, an application of Bayesian networks to the research problem is presented, and the usefulness of the model in examining the problem is demonstrated. The theoretical contribution includes definitions and a model of anticipatory decision-making. The main theoretical contribution of this study has been to develop a process for anticipatory decision-making that combines management with communication, problem solving and the improvement of knowledge. The practical contribution includes a Bayesian Decision Support Model, which is based on Bayesian influence diagrams. The main contributions of this research are two developed processes: one for anticipatory decision-making, and the other to produce a model of a Bayesian network for anticipatory decision-making. In summary, this research contributes to decision-making support by being one of the few publicly available academic descriptions of an anticipatory decision support system, by representing a Bayesian model that is grounded in firm theoretical discussion, by publishing algorithms suitable for decision-making support, and by defining the idea of anticipatory decision-making for a parallel version. Finally, based on the results of the research, an analysis of anticipatory management for planned decision-making is presented, based on observation of the environment, analysis of weak signals, and alternatives for creative problem solving and communication.
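
As a minimal illustration of the Bayesian reasoning underlying such a support system, the toy calculation below updates the probability of a disruptive event after a weak signal is observed. The two-node network and its probabilities are invented for illustration, not taken from the study.

```python
# Two-node Bayesian network: Event -> Signal, queried with Bayes' rule
p_event = 0.05                       # prior probability of the disruptive event
p_signal_given_event = 0.70          # weak signal observed if the event is coming
p_signal_given_no_event = 0.10       # false-positive rate of the signal

p_signal = (p_signal_given_event * p_event
            + p_signal_given_no_event * (1 - p_event))
p_event_given_signal = p_signal_given_event * p_event / p_signal
print(f"P(event | weak signal) = {p_event_given_signal:.3f}")  # ~0.269
```

Observing the weak signal raises the event probability from 5% to about 27%, which is the kind of update an anticipatory decision-maker would act on.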

Relevance:

100.00%

Publisher:

Abstract:

This Master's thesis presents a review of the deterministic safety analysis methods currently in use. Deterministic safety analyses are used to assess the safety of nuclear power plants in different operating states. A plant's safety systems are dimensioned on the basis of the results of deterministic safety analysis. Deterministic safety analyses can be prepared using either a conservative or a statistical method. The conservative method aims to model the situation under consideration so that, with high confidence, the plant's actual behaviour is milder than the analysis result; the uncertainties of the analysis are accounted for by conservative assumptions. The statistical method is based on the best-estimate approach, i.e., on modelling the plant's behaviour as realistically as possible, and the uncertainties of the analysis are determined systematically by means of statistical mathematics. The thesis emphasizes the uncertainty analysis methods used to quantify the uncertainties of a statistical analysis. In the computational part of the thesis, the methods used to prepare a deterministic safety analysis are compared by computing a thermal-hydraulic safety analysis example. The accident considered in the calculation is a loss-of-coolant accident at the Olkiluoto 3 plant unit caused by a pipe break in the primary coolant circuit. On the basis of the computed example case, the statistical and conservative methods can be considered alternative ways of preparing a safety analysis. Both analyses produced acceptable and mutually comparable results of the same order of magnitude.
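
A standard ingredient of the statistical (best-estimate plus uncertainty) method described above, though not necessarily the exact procedure used in the thesis, is Wilks' formula, which gives the number of code runs needed so that the most extreme observed output bounds a given percentile with given confidence:

```python
from math import ceil, log

def wilks_n(coverage=0.95, confidence=0.95):
    """One-sided, first-order Wilks sample size: smallest N with
    1 - coverage**N >= confidence."""
    return ceil(log(1 - confidence) / log(coverage))

print(wilks_n())  # 59 runs for the classic 95%/95% criterion
```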

Relevance:

100.00%

Publisher:

Abstract:

Results of subgroup analysis (SA) reported in randomized clinical trials (RCT) cannot be adequately interpreted without information about the methods used in the study design and the data analysis. Our aim was to show how often inaccurate or incomplete reports occur. First, we selected eight methodological aspects of SA on the basis of their importance to a reader in determining the confidence that should be placed in the author's conclusions regarding such analysis. Then, we reviewed the current practice of reporting these methodological aspects of SA in clinical trials in four leading journals, i.e., the New England Journal of Medicine (NEJM), the Journal of the American Medical Association (JAMA), the Lancet, and the American Journal of Public Health (AJPH). Eight consecutive reports from each journal published after July 1, 1998 were included. Of the 32 trials surveyed, 17 (53%) had at least one SA. Overall, the proportion of RCT reporting a particular methodological aspect ranged from 23 to 94%. Information on whether the SA preceded or followed the data analysis was reported in only 7 (41%) of the studies. Of the total possible number of items to be reported, NEJM, JAMA, the Lancet and AJPH clearly mentioned 59%, 67%, 58% and 72%, respectively. We conclude that current reporting of SA in RCT is incomplete and inaccurate. The results of such SA may have harmful effects on treatment recommendations if accepted without judicious scrutiny. We recommend that editors improve the reporting of SA in RCT by giving authors a list of the important items to be reported.

Relevance:

100.00%

Publisher:

Abstract:

McCausland (2004a) describes a new theory of random consumer demand. Theoretically consistent random demand can be represented by a "regular" "L-utility" function on the consumption set X. The present paper is about Bayesian inference for regular L-utility functions. We express prior and posterior uncertainty in terms of distributions over the infinite-dimensional parameter set of a flexible functional form. We propose a class of proper priors on the parameter set. The priors are flexible, in the sense that they put positive probability in the neighborhood of any L-utility function that is regular on a large subset X̄ of X; and regular, in the sense that they assign zero probability to the set of L-utility functions that are irregular on X̄. We propose methods of Bayesian inference for an environment with indivisible goods, leaving the more difficult case of infinitely divisible goods for another paper. We analyse individual choice data from a consumer experiment described in Harbaugh et al. (2001).

Relevance:

100.00%

Publisher:

Abstract:

My thesis consists of three chapters related to the estimation of state-space and stochastic volatility models. In the first article, we develop a computationally efficient procedure for state smoothing in linear Gaussian state-space models. We show how to exploit the particular structure of state-space models to draw the latent states efficiently. We analyze the computational efficiency of methods based on the Kalman filter, of the Cholesky factor algorithm, and of our new method, using operation counts and computational experiments. We show that for many important cases our method is more efficient. The gains are particularly large when the dimension of the observed variables is large, or when repeated draws of the states are needed for the same parameter values. As an application, we consider a multivariate Poisson model with time-varying intensities, which is used to analyze transaction count data from financial markets. In the second chapter, we propose a new technique for analyzing multivariate stochastic volatility models. The proposed method is based on efficiently drawing the volatility from its conditional density given the parameters and the data. Our methodology applies to models with several types of cross-sectional dependence. We can model time-varying conditional correlation matrices by incorporating factors into the returns equation, where the factors are independent stochastic volatility processes. We can incorporate copulas to allow conditional dependence of the returns given the volatility, permitting different Student marginal distributions with specific degrees of freedom to capture the heterogeneity of the returns. We draw the volatility as a block in the time dimension and one series at a time in the cross-sectional dimension. We apply the method introduced by McCausland (2012) to obtain a good approximation of the conditional posterior distribution of the volatility of one return given the volatilities of the other returns, the parameters, and the dynamic correlations. The model is evaluated using real data for ten exchange rates. We report results for univariate stochastic volatility models and for two multivariate models. In the third chapter, we evaluate the information that realized volatility contributes to the estimation and forecasting of volatility when prices are measured with and without error. We use stochastic volatility models. We take the point of view of an investor for whom volatility is an unknown latent variable and realized volatility is a sample quantity that carries information about it. We employ Bayesian Markov chain Monte Carlo methods to estimate the models, which allow the computation not only of posterior densities of volatility but also of predictive densities of future volatility. We compare the volatility forecasts, and the hit rates of forecasts, that do and do not use the information contained in realized volatility. This approach differs from existing ones in the empirical literature, which are mostly limited to documenting the ability of realized volatility to forecast itself. We present empirical applications using daily returns of indices and exchange rates. The competing models are applied to the second half of 2008, a notable period in the recent financial crisis.
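
As a rough illustration of the precision-based state smoothing discussed in the first chapter, the sketch below draws the latent states of a toy linear Gaussian AR(1) state-space model jointly from their posterior, whose precision matrix is tridiagonal. For clarity it uses dense linear algebra; the efficiency gains described in the thesis come from exploiting the band structure, which this sketch deliberately does not do. All values are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
T, sig_eps, sig_eta, phi = 200, 1.0, 0.5, 0.95

x = np.zeros(T)                                  # simulate a toy AR(1) state
for t in range(1, T):
    x[t] = phi * x[t - 1] + rng.normal(0.0, sig_eta)
y = x + rng.normal(0.0, sig_eps, T)              # noisy observations

# Posterior precision of x | y is tridiagonal (x_1 is given a N(0, sig_eta^2)
# prior so that its diagonal entry matches the interior ones).
Om = np.zeros((T, T))
i = np.arange(T)
Om[i, i] = 1 / sig_eps**2 + (1 + phi**2) / sig_eta**2
Om[T - 1, T - 1] = 1 / sig_eps**2 + 1 / sig_eta**2
Om[i[:-1], i[:-1] + 1] = Om[i[:-1] + 1, i[:-1]] = -phi / sig_eta**2
c = y / sig_eps**2                               # co-vector of the Gaussian

L = np.linalg.cholesky(Om)                       # Omega = L L'
mu = np.linalg.solve(L.T, np.linalg.solve(L, c)) # posterior mean of the states
draw = mu + np.linalg.solve(L.T, rng.standard_normal(T))  # one joint draw
```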