902 results for Rare Events
Abstract:
We prove that for any α-mixing stationary process the hitting time of any n-string A(n) converges, when suitably normalized, to an exponential law. We identify the normalization constant λ(A(n)). A similar statement also holds for the return time. To establish this result we prove two other results of independent interest. First, we show a relation between the rescaled hitting time and the rescaled return time, generalizing a theorem of Haydn, Lacroix and Vaienti. Second, we show that for positive-entropy systems, the probability of observing any n-string in n consecutive observations goes to zero as n goes to infinity. (c) 2010 Elsevier B.V. All rights reserved.
Abstract:
We aim to provide a review of the stochastic discount factor bounds usually applied to diagnose asset pricing models. In particular, we discuss the bounds used to analyze the disaster model of Barro (2006). We focus on this disaster model because the stochastic discount factor bounds applied to study the performance of disaster models usually follow the approach of Barro (2006). We first present the entropy bounds that provide a diagnosis of the analyzed disaster model, namely the methods of Almeida and Garcia (2012, 2016) and Ghosh et al. (2016). We then discuss how their results for the disaster model relate to each other, and also present the findings of other methodologies that are similar to these bounds but provide different evidence on the performance of the framework developed by Barro (2006).
Abstract:
Stochastic simulation is an important and practical technique for computing probabilities of rare events, such as the payoff probability of a financial option, the probability that a queue exceeds a certain level, or the probability of ruin of an insurer's risk process. Rare events occur so infrequently that they cannot be reasonably recorded during a standard simulation procedure: specific simulation algorithms that counteract the rarity of the event to be simulated are required. An important algorithm in this context is based on changing the sampling distribution and is called importance sampling. Optimal Monte Carlo algorithms for computing rare-event probabilities are either logarithmically efficient or possess bounded relative error.
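The importance-sampling idea described in this abstract can be sketched in a few lines. The example below is an illustration (not taken from any of the cited works): it estimates the Gaussian tail probability P(Z > 5) by sampling from a normal density shifted to the rare level and reweighting by the likelihood ratio. A crude Monte Carlo run of the same size would almost surely record zero such events.

```python
import math
import random

def importance_sampling_tail(gamma, n=100_000, seed=0):
    """Estimate P(Z > gamma) for Z ~ N(0, 1) by importance sampling:
    draw from N(gamma, 1) and reweight each hit with the likelihood
    ratio phi(y) / phi(y - gamma) = exp(-gamma*y + gamma**2 / 2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        y = rng.gauss(gamma, 1.0)   # proposal centred at the rare level
        if y > gamma:
            total += math.exp(-gamma * y + 0.5 * gamma * gamma)
    return total / n

# Exact value: P(Z > 5) is about 2.87e-07; with 100,000 naive samples
# the event {Z > 5} would almost surely never be observed.
est5 = importance_sampling_tail(5.0)
```

Shifting the proposal mean exactly to the rare level is the classical exponential change of measure for the Gaussian case; it is logarithmically efficient in the sense used in the abstract.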
Abstract:
Predicting failures in a distributed system based on previous events through logistic regression is a standard approach in the literature. This technique is not reliable, though, in two situations: in the prediction of rare events, which do not appear in a large enough proportion for the algorithm to capture them, and in environments with too many variables, where logistic regression tends to overfit; manually selecting a subset of variables to create the model is error-prone. In this paper, we solve an industrial research case that presented this situation with a combination of elastic net logistic regression, a method that allows us to automatically select useful variables, a process of cross-validation on top of it, and the application of a rare-events prediction technique to reduce computation time. This process provides two layers of cross-validation that automatically obtain the optimal model complexity and the optimal model parameter values, while ensuring that even rare events will be correctly predicted with a low number of training instances. We tested this method against real industrial data, obtaining a total of 60 out of 80 possible models with a 90% average model accuracy.
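As a minimal sketch of the elastic-net mechanism this abstract relies on (the paper's actual pipeline and data are not reproduced; all names and parameter values here are illustrative), the code below fits an L1+L2-penalised logistic regression by proximal gradient descent. The soft-thresholding step is what drives uninformative coefficients toward zero, i.e. the automatic variable selection the abstract refers to.

```python
import math
import random

def sigmoid(z):
    # guard against overflow for very negative z
    return 1.0 / (1.0 + math.exp(-z)) if z > -30 else 0.0

def fit_elastic_net_logistic(X, y, alpha=0.05, l1_ratio=0.5, lr=0.5, epochs=300):
    """Logistic regression with an elastic-net penalty, fitted by proximal
    gradient descent: the L2 part is folded into the gradient step and the
    L1 part is applied via soft-thresholding, which zeroes out weak
    coefficients (the automatic variable selection)."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    l1 = alpha * l1_ratio
    l2 = alpha * (1.0 - l1_ratio)
    for _ in range(epochs):
        grad = [0.0] * d
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)))
            for j in range(d):
                grad[j] += (p - yi) * xi[j]
        for j in range(d):
            wj = w[j] - lr * (grad[j] / n + l2 * w[j])             # gradient + ridge
            w[j] = math.copysign(max(abs(wj) - lr * l1, 0.0), wj)  # lasso threshold
    return w

# Toy data: only the first of three features is informative.
rng = random.Random(1)
X = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(200)]
y = [1 if row[0] > 0 else 0 for row in X]
w = fit_elastic_net_logistic(X, y)
acc = sum((sum(wj * xj for wj, xj in zip(w, row)) > 0) == (yi == 1)
          for row, yi in zip(X, y)) / len(X)
```

In practice one would tune `alpha` and `l1_ratio` by cross-validation, which is exactly the double cross-validation layer the abstract describes.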
Abstract:
Temperature chaos has often been reported in the literature as a rare-event-driven phenomenon. However, this fact has always been ignored in the data analysis, thus erasing the signal of the chaotic behavior (still rare at the sizes achieved) and leading to an overall picture of a weak and gradual phenomenon. In contrast, our analysis relies on a large-deviations functional that allows us to discuss the size dependence. In addition, we had at our disposal unprecedentedly large configurations equilibrated at low temperatures, thanks to the Janus computer. According to our results, when temperature chaos occurs its effects are strong and can be felt even at short distances.
Abstract:
I joined the research group of Prof. McCammon (University of California San Diego) as a postdoctoral researcher with a Beatriu de Pinós fellowship on 1 December 2010, and carried out my research there until 1 April 2012. Prof. McCammon is a world reference in the application of molecular dynamics (MD) simulations to biological systems of human interest. His most important contribution to the simulation of biological systems is the development of the accelerated molecular dynamics (AMD) method. Conventional MD simulations, which are limited to the nanosecond timescale (~10^-9 s), are not suitable for studying biological systems relevant at longer timescales (μs, ms, ...). AMD makes it possible to explore infrequent molecular phenomena that are nevertheless key to understanding many biological systems; phenomena that could not be observed otherwise. During my stay at the University of California San Diego, I worked on different applications of AMD simulations, including photochemistry and computer-aided drug design. Specifically, I first successfully developed a combination of the AMD method and Car-Parrinello simulations to improve the exploration of deactivation pathways (conical intersections) in photoactivated chemical reactions. Second, I applied statistical techniques (Replica Exchange) together with AMD to the description of protein-ligand interactions. Finally, I carried out a computer-aided drug design study on the Rho G-protein (involved in the development of human cancer), combining structural analyses and AMD simulations. The projects I participated in have been published (or are still under review) in different scientific journals and have been presented at several international conferences. The report below gives more details on each of these projects.
Abstract:
Intense extra-tropical cyclones are often associated with strong winds, heavy precipitation and socio-economic impacts. Over southwestern Europe, such storms occur less often, but still cause high economic losses. We characterise the large-scale atmospheric conditions and cyclone tracks during the top-100 potential losses over Iberia associated with wind events. Based on 65 years of reanalysis data, events are classified into four groups: (i) cyclone tracks crossing over Iberia on the event day (“Iberia”), (ii) cyclones crossing further north, typically southwest of the British Isles (“North”), (iii) cyclones crossing southwest to northeast near the northwest tip of Iberia (“West”), and (iv) so-called “Hybrids”, characterised by a strong pressure gradient over Iberia due to the juxtaposition of low and high pressure centres. Generally, “Iberia” events are the most frequent (31% to 45% for top-100 vs. top-20), while “West” events are rare (10% to 12%). 70% of the events were primarily associated with a cyclone. Multi-decadal variability in the number of events is identified. While the peak in recent years is quite prominent, other comparably stormy periods occurred in the 1960s and 1980s. This study documents that damaging wind storms over Iberia are not rare events, and that their frequency of occurrence undergoes strong multi-decadal variability.
Abstract:
We present a novel method, called the transform likelihood ratio (TLR) method, for the estimation of rare-event probabilities with heavy-tailed distributions. Via a simple transformation (change of variables) technique, the TLR method reduces the original rare-event probability estimation with heavy-tailed distributions to an equivalent one with light-tailed distributions. Once this transformation has been established, we estimate the rare-event probability via importance sampling, using either the classical exponential change of measure or the standard likelihood ratio change of measure. In the latter case the importance sampling distribution is chosen from the same parametric family as the transformed distribution. We estimate the optimal parameter vector of the importance sampling distribution using the cross-entropy method. We prove the polynomial complexity of the TLR method for certain heavy-tailed models and demonstrate numerically its high efficiency for various heavy-tailed models previously thought to be intractable. We also show that the TLR method can be viewed as a universal tool in the sense that it not only provides a unified view of heavy-tailed simulation but can also be used efficiently in simulation with light-tailed distributions. We present extensive simulation results which support the efficiency of the TLR method.
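The transform-then-tilt idea can be illustrated on the simplest possible case (a Pareto target; the paper's general construction and its cross-entropy tuning are not reproduced here, so this is only a sketch under that assumption). The substformation Y = α log X turns the heavy Pareto tail into a light-tailed Exp(1) problem, which is then handled with a standard exponential change of measure.

```python
import math
import random

def tlr_pareto_tail(gamma, alpha=1.0, n=200_000, seed=0):
    """TLR-style estimate of P(X > gamma) for a Pareto variable on [1, inf)
    with tail P(X > x) = x**(-alpha).  The transform Y = alpha*log(X) gives
    Y ~ Exp(1) (light-tailed); the light-tailed problem P(Y > t) is then
    solved by importance sampling with an exponential change of measure."""
    t = alpha * math.log(gamma)   # light-tailed threshold
    lam = 1.0 / t                 # tilted rate: proposal mean equals t
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        y = rng.expovariate(lam)  # sample from the tilted Exp(lam)
        if y > t:
            # likelihood ratio of Exp(1) against Exp(lam)
            total += math.exp(-(1.0 - lam) * y) / lam
    return total / n

# Exact answer is gamma**(-alpha) = 1e-06 here, so correctness is easy to check.
est = tlr_pareto_tail(1e6)
```
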
Abstract:
Maternal mortality (MM) is a core indicator of disparities in women's rights. The study of near-miss cases is strategic to identifying the breakdowns in obstetric care. In absolute numbers, both MM and the occurrence of eclampsia are rare events. We aim to assess the obstetric care indicators and main predictors of severe maternal outcome from eclampsia (SMO: maternal death plus maternal near miss). This is a secondary analysis of a multicenter, cross-sectional study including 27 centers from all geographic regions of Brazil, from 2009 to 2010. 426 cases of eclampsia were identified and classified according to the outcomes: SMO and non-SMO. We classified facilities as coming from low- and high-income regions and calculated the WHO's obstetric health indicators. SPSS and Stata software were used to calculate prevalence ratios (PR) and the respective 95% confidence intervals (CI) to assess maternal characteristics, clinical and obstetric history, and access to health services as predictors of SMO, subsequently correlating them with the corresponding perinatal outcomes and also applying multiple regression analysis (adjusted for cluster effect). Prevalence of and mortality indexes for eclampsia in higher- and lower-income regions were 0.2%/0.8% and 8.1%/22%, respectively. Regarding difficulties in access to health care, ICU admission (adjPR 3.61; 95% CI 1.77-7.35) and inadequate monitoring (adjPR 2.31; 95% CI 1.48-3.59) were associated with SMO. Morbidity and mortality associated with eclampsia were high in Brazil, especially in lower-income regions. Promoting quality maternal health care and improving the availability of obstetric emergency care are essential actions to relieve the burden of eclampsia.
Abstract:
This paper proposes the creation of an objectively acquired reference database to more accurately characterize the incidence and long-term risk of relatively infrequent, but serious, adverse events. Such a database would be maintained longitudinally to provide for ongoing comparison with new rheumatologic drug safety databases collecting the occurrences and treatments of rare events. We propose the establishment of product-specific registries to prospectively follow a cohort of patients with rheumatoid arthritis (RA) who receive newly approved therapies. In addition, a database is required of a much larger cohort of RA patients treated with multiple second-line agents, of sufficient size to enable case-controlled determinations of the relative incidence of rare but serious events in the treated (registry) population versus the larger disease population. The number of patients necessary for agent-specific registries, and a larger patient population adequate to supply a matched case-control cohort, will depend upon estimates of the detectability of an increased incidence over background. We suggest a system to carry out this proposal that will involve an umbrella organization, responsible for the establishment of this large patient cohort, envisioned to be drawn from around the world.
Abstract:
The splitting method is a simulation technique for the estimation of very small probabilities. In this technique, the sample paths are split into multiple copies, at various stages in the simulation. Of vital importance to the efficiency of the method is the Importance Function (IF). This function governs the placement of the thresholds or surfaces at which the paths are split. We derive a characterisation of the optimal IF and show that for multi-dimensional models the natural choice for the IF is usually not optimal. We also show how nearly optimal splitting surfaces can be derived or simulated using reverse time analysis. Our numerical experiments illustrate that by using the optimal IF, one can obtain a significant improvement in simulation efficiency.
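The splitting technique can be demonstrated on a toy random walk with negative drift, using the current level as the importance function (a natural, though, as the abstract notes, not necessarily optimal, choice). Each first crossing of a new level splits the path into two copies carrying half the weight, so paths that make progress toward the rare set are multiplied instead of being lost. This sketch is illustrative and not taken from the paper.

```python
import random

def splitting_ruin(b=12, p_up=0.4, split=2, n0=4000, seed=0):
    """Multilevel splitting estimate of the probability that a random walk
    (step +1 with probability p_up, else -1), started at 1, reaches level b
    before 0.  Importance function = current level: each first crossing of
    a new integer level splits the path into `split` copies, each carrying
    1/split of the weight, which keeps the estimator unbiased."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n0):
        stack = [(1, 1, 1.0)]          # (position, highest level seen, weight)
        while stack:
            x, best, w = stack.pop()
            while 0 < x < b:
                x += 1 if rng.random() < p_up else -1
                if x > best:           # first crossing of a new threshold
                    best = x
                    w /= split
                    for _ in range(split - 1):
                        stack.append((x, best, w))   # spawn extra copies
            if x == b:
                total += w
    return total / n0

# Gambler's-ruin benchmark: exact value (r - 1)/(r**b - 1) with r = 0.6/0.4,
# i.e. roughly 3.9e-03 for b = 12, so the estimate can be checked directly.
est = splitting_ruin()
```
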
Abstract:
Some of the properties sought in seismic design of buildings are also considered fundamental to guarantee structural robustness. Moreover, some key concepts are common to both seismic and robustness design. In fact, both analyses consider events with a very small probability of occurrence, and consequently a significant level of damage is admissible. As very rare events, in both cases, the actions are extremely hard to quantify. The acceptance of limited damage requires a system-based analysis of structures, rather than an element-by-element methodology as employed for other load cases. As in robustness analysis, in seismic design the main objective is to guarantee that the structure survives an earthquake without extensive damage. In the case of seismic design, this is achieved by guaranteeing the dissipation of energy through plastic hinges distributed in the structure. For this to be possible, some key properties must be assured, in particular ductility and redundancy. The same properties can be fundamental in robustness design, as a structure can only sustain significant damage if it is capable of distributing stresses to parts of the structure unaffected by the triggering event. Timber is often used for primary load-bearing elements in single-storey long-span structures for public buildings and arenas, where severe consequences can be expected if one or more of the primary load-bearing elements fail. The structural system used for these structures consists of main frames, secondary elements and bracing elements. The main frames, composed of columns and beams, can be seen as key elements in the system and should be designed with high safety against failure and under strict quality control. The main frames may sometimes be designed with moment-resisting joints between columns and beams. Scenarios where one or more of these key elements fail should be considered, at least for high-consequence buildings.
Two alternative strategies may be applied: isolation of collapsing sections, and provision of alternate load paths [1]. The first is relatively straightforward to provide by deliberately designing the secondary structural system to be less strong and stiff. Alternatively, the secondary structural system and the bracing system can be designed so that loss of capacity in the main frame does not lead to collapse. A case study has been selected to assess the consequences of these two different strategies, in particular under seismic loads.
Abstract:
National companies face the need to respond to the market with a wide variety of products, small production runs and short delivery times. The competitiveness of companies in a global market thus depends on their efficiency, their flexibility, the quality of their products and reduced costs. To achieve these objectives it is necessary to develop strategies and action plans involving the production equipment, including: the creation of new, complex and more reliable equipment; the modification of existing equipment, modernising it to meet current needs and to increase its availability and productivity; and the implementation of more assertive maintenance policies focused on the goal of "zero breakdowns", as is the case with predictive maintenance. In this context, the main objective of this work is to predict the optimal time for the maintenance of an industrial machine: a refiner at the Mangualde plant of the company Sonae Industria, which operates continuously 24 hours a day, 365 days a year. For this purpose, measurements from sensors that continuously monitor the state of the refiner are used. The main maintenance operation on this machine is the replacement of the two metal discs of its main component, the defibrator. Consequently, the refiner sensor analysed in greatest detail is the one that measures the distance between the two discs of the defibrator. ARIMA models are an advanced statistical approach to time-series forecasting. Based on a description of the autocorrelation of the data, these models describe a time series as a function of its past values.
In this work, the ARIMA methodology is used to determine a model that forecasts future values of the sensor measuring the distance between the two defibrator discs, thereby determining the optimal moment for their replacement and avoiding forced production stops caused by disc-wear failures. The results obtained in this work constitute an important scientific contribution to the field of predictive maintenance and fault detection in industrial equipment.
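As an illustrative sketch of the forecasting step (the thesis's actual ARIMA model and sensor data are not reproduced here; the series below is hypothetical), the simplest ARIMA(0,1,0)-with-drift model treats the first differences of the sensor series as a constant drift plus noise. Even this minimal model already yields a forecast of when the disc-distance reading will cross a maintenance threshold.

```python
import math

def steps_to_threshold(series, threshold):
    """Random-walk-with-drift forecast (ARIMA(0,1,0) with a constant):
    model the first differences of the sensor series as drift + noise and
    return the number of future steps until the forecast crosses
    `threshold` (None if the drift points away from it)."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    drift = sum(diffs) / len(diffs)
    last = series[-1]
    if drift == 0 or (threshold - last) / drift <= 0:
        return None
    return math.ceil((threshold - last) / drift)

# Hypothetical disc-distance readings shrinking by 2 units per step:
series = [100 - 2 * t for t in range(30)]
steps = steps_to_threshold(series, 10)   # forecast horizon for maintenance
```

A full ARIMA(p, d, q) fit would also model the autocorrelation of the differenced series, as the abstract describes; the drift-only case is the degenerate p = q = 0 instance.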
Abstract:
Extreme value theory (EVT) deals with the occurrence of extreme phenomena. The tail index is a very important parameter in the estimation of the probability of rare events. Under a semiparametric framework, inference requires the choice of a number k of upper order statistics to be considered. This is the crux of the matter, and there is no definitive formula for choosing it, since a small k leads to high variance and large values of k tend to increase the bias. Several methodologies have emerged in the literature, especially concerning the most popular estimator, the Hill estimator (Hill, 1975). In this work we compare through simulation well-known procedures presented in Drees and Kaufmann (1998), Matthys and Beirlant (2000), Beirlant et al. (2002) and de Sousa and Michailidis (2004) with a heuristic scheme considered in Frahm et al. (2005), developed for the estimation of a different tail measure but in a similar context. We will see that the new method may be an interesting alternative.
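The Hill estimator mentioned above is short enough to state in code. The sketch below uses an illustrative, fixed choice of k; choosing k adaptively is exactly the delicate point the abstract discusses.

```python
import math
import random

def hill_estimator(data, k):
    """Hill (1975) estimator of the tail index alpha, based on the k upper
    order statistics: the mean log-spacing above the (k+1)-th largest value
    estimates 1/alpha."""
    xs = sorted(data, reverse=True)
    x_k1 = xs[k]
    gamma = sum(math.log(x / x_k1) for x in xs[:k]) / k
    return 1.0 / gamma

# Simulated Pareto sample with true tail index alpha = 2 (inverse transform:
# if U ~ Uniform(0, 1], then U**(-1/2) has tail P(X > x) = x**(-2)).
rng = random.Random(7)
sample = [(1.0 - rng.random()) ** -0.5 for _ in range(20_000)]
alpha_hat = hill_estimator(sample, 500)
```

For exact Pareto data the estimator is unbiased for any k; the bias/variance trade-off in k appears once the tail is only asymptotically Pareto.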
Abstract:
SUMMARY
In order to increase drug safety we must better understand how medication interacts with the bodies of our patients, and this knowledge should be made easily available to the clinicians prescribing the medication. This thesis contributes to increasing the knowledge of some drug properties and to making that information readily accessible to medical professionals. Furthermore, it investigates the use of therapeutic drug monitoring, drug interaction databases and pharmacogenetic tests in pharmacovigilance. Two pharmacogenetic studies in the naturalistic setting of psychiatric in-patient clinics have been performed: one with the antidepressant mirtazapine, the other with the antipsychotic clozapine. Forty-five depressed patients were treated with mirtazapine and followed for 8 weeks. The therapeutic effect was as seen in previous studies. Enantioselective analyses confirmed an influence of age, gender and smoking on the pharmacokinetics of mirtazapine; they showed a significant influence of the CYP2D6 genotype on the antidepressant-effective S-enantiomer, and for the first time an influence of the CYP2B6 genotype on the plasma concentrations of the 8-OH metabolite was found. The CYP2B6*6/*6 genotype was associated with a better treatment response. A detailed hypothesis of the metabolic pathways of mirtazapine is proposed. In the second pharmacogenetic study, analyses of 75 schizophrenic patients treated with clozapine showed the influence of CYP450 and ABCB1 genotypes on its pharmacokinetics. For the first time we could demonstrate an in vivo effect of the CYP2C19 genotype and an influence of P-glycoprotein on the plasma concentrations of clozapine. Further, we confirmed in vivo the prominent role of CYP1A2 in the metabolism of clozapine. Identifying risk factors for the occurrence of serious adverse drug reactions (SADR) would allow a more individualized and safer drug therapy. SADR are rare events and therefore difficult to study.
We tested the feasibility of a nested matched case-control study to examine the influence of high drug plasma levels and CYP2D6 genotypes on the risk of experiencing an SADR. In our sample we compared 62 SADR cases with 82 controls; both groups were psychiatric patients from the in-patient clinic Königsfelden. Drug plasma levels above 120% of the upper recommended reference could be identified as a risk factor, with a statistically significant odds ratio of 3.5; a similar trend was seen for CYP2D6 poor metabolisers. Although a matched case-control design seems a valid method, 100% matching is not easy to achieve in the relatively small cohort of a single in-patient clinic. However, a nested case-control study is feasible. On the basis of the experience gained in the AMSP+ study, and given that we have today only sparse data indicating that routine drug plasma concentration monitoring and/or pharmacogenetic testing in psychiatry is justified to minimize the risk of ADR, we developed a test algorithm named "TDM plus" (TDM plus interaction checks plus pharmacogenetic testing). Pharmacovigilance programs such as the AMSP project (AMSP = Arzneimittelsicherheit in der Psychiatrie) survey psychiatric in-patients in order to collect SADR and to detect new safety signals. Case reports of such SADR are, although anecdotal, valuable to illustrate rare clinical events and sometimes to confirm theoretical assumptions, e.g. about drug interactions. Seven pharmacovigilance case reports are summarized in this thesis. To provide clinicians with meaningful information on the risk of drug combinations, the internet-based drug interaction program mediQ.ch (in German) was developed during the course of this thesis. Risk estimation is based on published clinical and pharmacological information on single drugs and alimentary products, including adverse drug reaction profiles. Information on risk factors such as renal and hepatic insufficiency and specific genotypes is given.
More than 20'000 drug pairs have been described in detail. Over 2000 substances with their metabolic and transport pathways are included, and all information is referenced with links to the published scientific literature or other information sources. Medical professionals from more than 100 hospitals and 300 individual practitioners consult mediQ.ch regularly. Validations with comparisons to other drug interaction programs show good results. Finally, therapeutic drug monitoring, drug interaction programs and pharmacogenetic tests are helpful tools in pharmacovigilance and should, in the absence of sufficient data supporting routine tests, be used as proposed in our TDM plus algorithm.