981 results for Trimmed likelihood
Abstract:
BACKGROUND/OBJECTIVES: This study aims to assess whether patent foramen ovale (PFO) closure is superior to medical therapy in preventing recurrence of cryptogenic ischemic stroke or transient ischemic attack (TIA). METHODS: We searched PubMed for randomized trials that compared PFO closure with medical therapy in cryptogenic stroke/TIA using the search terms: "stroke or cerebrovascular accident or TIA" and "patent foramen ovale or paradoxical embolism" and "trial or study". RESULTS: Among 650 potentially eligible articles, 3 were included, comprising 2303 patients. There was no statistically significant difference between PFO closure and medical therapy in ischemic stroke recurrence (1.91% vs. 2.94%, respectively; OR: 0.64, 95%CI: 0.37-1.10), TIA (2.08% vs. 2.42%, respectively; OR: 0.87, 95%CI: 0.50-1.51) or death (0.60% vs. 0.86%, respectively; OR: 0.71, 95%CI: 0.28-1.82). In subgroup analysis, there was a significant reduction of ischemic strokes in the AMPLATZER PFO Occluder arm vs. medical therapy (1.4% vs. 3.04%, respectively; OR: 0.46, 95%CI: 0.21-0.98; relative-risk-reduction: 53.2%; absolute-risk-reduction: 1.6%; number-needed-to-treat: 61.8) but not in the STARFlex arm (2.7% vs. 2.8% with medical therapy; OR: 0.93, 95%CI: 0.45-2.11). Compared to medical therapy, the number of patients with new-onset atrial fibrillation (AF) was similar in the AMPLATZER PFO Occluder arm (0.72% vs. 1.28%, respectively; OR: 1.81, 95%CI: 0.60-5.42) but higher in the STARFlex arm (0.64% vs. 5.14%, respectively; OR: 8.30, 95%CI: 2.47-27.84). CONCLUSIONS: This meta-analysis does not support PFO closure with unselected devices for secondary prevention in cryptogenic stroke/TIA. In subgroup analysis, however, selected closure devices may be superior to medical therapy without increasing the risk of new-onset AF. This observation should be confirmed in further trials using inclusion criteria that target patients with a high likelihood of PFO-related stroke recurrence.
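As a quick arithmetic check on the subgroup figures above, the sketch below recomputes the absolute risk reduction, relative risk reduction and number needed to treat from the rounded event rates quoted in the abstract; the published values were presumably derived from unrounded counts, so the results differ slightly.

```python
# Sketch: recover relative-risk-reduction (RRR), absolute-risk-reduction (ARR)
# and number-needed-to-treat (NNT) from the event rates quoted in the abstract.
# Rates are the rounded percentages given above, so results differ slightly
# from the published 53.2% / 1.6% / 61.8.

rate_closure = 0.014   # ischemic stroke rate, AMPLATZER PFO Occluder arm
rate_medical = 0.0304  # ischemic stroke rate, medical therapy arm

arr = rate_medical - rate_closure   # absolute risk reduction
rrr = arr / rate_medical            # relative risk reduction
nnt = 1.0 / arr                     # number needed to treat

print(f"ARR = {arr:.2%}, RRR = {rrr:.1%}, NNT = {nnt:.1f}")
# ARR = 1.64%, RRR = 53.9%, NNT = 61.0
```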
Abstract:
C4 photosynthesis is an adaptation derived from the more common C3 photosynthetic pathway that confers a higher productivity under warm temperature and low atmospheric CO2 concentration [1, 2]. C4 evolution has been seen as a consequence of past atmospheric CO2 decline, such as the abrupt CO2 fall 32-25 million years ago (Mya) [3-6]. This relationship has never been tested rigorously, mainly because of a lack of accurate estimates of divergence times for the different C4 lineages [3]. In this study, we inferred a large phylogenetic tree for the grass family and estimated, through Bayesian molecular dating, the ages of the 17 to 18 independent grass C4 lineages. The first transition from C3 to C4 photosynthesis occurred in the Chloridoideae subfamily, 32.0-25.0 Mya. The link between CO2 decrease and the transition to C4 photosynthesis was tested by a novel maximum likelihood approach. We showed that the model incorporating atmospheric CO2 levels was significantly better than the null model, supporting the importance of CO2 decline for the evolvability of C4 photosynthesis. This finding is relevant for understanding the origin of C4 photosynthesis in grasses, which is one of the most successful ecological and evolutionary innovations in plant history.
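The test of the CO2-dependent model against the null model is, in spirit, a likelihood-ratio test. A minimal, generic sketch with placeholder log-likelihoods (the actual analysis used purpose-built models of C3/C4 transitions on the dated phylogeny):

```python
# Sketch of a likelihood-ratio test between a null model and a model with one
# extra parameter (here, a dependence on atmospheric CO2). The log-likelihoods
# below are placeholders, not values from the study.
from scipy.stats import chi2

loglik_null = -250.0   # hypothetical maximized log-likelihood, null model
loglik_co2 = -244.0    # hypothetical maximized log-likelihood, CO2 model
extra_params = 1       # difference in number of free parameters

lr_stat = 2.0 * (loglik_co2 - loglik_null)   # likelihood-ratio statistic
p_value = chi2.sf(lr_stat, df=extra_params)  # asymptotic chi-square p-value

print(f"LR = {lr_stat:.1f}, p = {p_value:.4f}")  # LR = 12.0, p = 0.0005
```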
Abstract:
PURPOSE: To investigate the incidence and distribution of congenital structural cardiac malformations among the offspring of mothers with type 1 diabetes and the influence of periconceptional glycemic control. METHODS: Multicenter retrospective clinical study, literature review, and meta-analysis. The incidence and pattern of congenital heart disease in our own study population and in the literature on the offspring of type 1 diabetic mothers were compared with the incidence and spectrum of the various cardiovascular defects in the offspring of nondiabetic mothers as registered by EUROCAT Northern Netherlands. In addition, medical records were reviewed for HbA(1c) during the 1st trimester. RESULTS: The distribution of congenital heart anomalies in our diabetic study population was in accordance with the distribution reported in the literature. This distribution differed considerably from that in the nondiabetic population. Approximately half of the cardiovascular defects were conotruncal anomalies. Our study also demonstrated a remarkable increase in the likelihood of visceral heterotaxia and variants of single ventricle among these patients. As expected, elevated HbA(1c) values during the 1st trimester were associated with fetal cardiovascular defects in the offspring. CONCLUSION: This study shows an increased likelihood of specific heart anomalies, namely transposition of the great arteries, persistent truncus arteriosus, visceral heterotaxia and single ventricle, among the offspring of diabetic mothers. This suggests a profound teratogenic effect at a very early stage of cardiogenesis. The study emphasizes the frequency with which the offspring of diabetes-complicated pregnancies suffer from complex forms of congenital heart disease. Pregnancies with poor 1st-trimester glycemic control are more prone to the presence of fetal heart disease.
Abstract:
Pleistocene glacial and interglacial periods have moulded the evolutionary history of European cold-adapted organisms. The role of the different mountain massifs has, however, not been accurately investigated in the case of high-altitude insect species. Here, we focus on three closely related species of non-flying leaf beetles of the genus Oreina (Coleoptera, Chrysomelidae), which are often found in sympatry within the mountain ranges of Europe. After showing that the species concept as currently applied does not match barcoding results, we use more than 700 sequences from one nuclear and three mitochondrial genes to show the role of biogeography in shaping the phylogenetic hypothesis. Dating the phylogeny using an insect molecular clock, we show that the earliest lineages diverged more than 1 Mya and that the main shift in diversification rate occurred between 0.36 and 0.18 Mya. By using a probabilistic approach on the parsimony-based dispersal/vicariance framework (MP-DIVA) as well as a direct likelihood method of state-change optimization, we show that the Alps acted as a crossroads with multiple events of dispersal to and reinvasion from neighbouring mountains. However, the relative importance of vicariance vs. dispersal events in the process of rapid diversification remains difficult to evaluate because of a bias towards overestimation of vicariance in the DIVA algorithm. Parallels are drawn with recent studies of cold-adapted species, although our study reveals novel patterns in diversity and genetic links between European mountains, and highlights the importance of neglected regions, such as the Jura and the Balkan range.
Abstract:
BACKGROUND AND PURPOSE: Management of brain arteriovenous malformation (bAVM) is controversial. We have analyzed the largest surgical bAVM cohort for outcome. METHODS: Both operated and nonoperated cases were included for analysis. A total of 779 patients with bAVMs were consecutively enrolled between 1989 and 2014. Initial management recommendations were recorded before commencement of treatment. Surgical outcome was prospectively recorded, and outcomes were assigned at the last follow-up visit using the modified Rankin Scale. First, a sensitivity analysis was performed to select a subset of the entire cohort for which the results of surgery could be generalized. Second, from this subset, variables were analyzed for risk of deficit or near miss (intraoperative hemorrhage requiring blood transfusion of ≥2.5 L, hemorrhage in the resection bed requiring reoperation, or hemorrhage associated with either digital subtraction angiography or embolization). RESULTS: A total of 7.7% of patients with Spetzler-Ponce class A and B bAVMs had an adverse outcome from surgery leading to a modified Rankin Scale score >1. Sensitivity analyses demonstrated that outcome results were not subject to selection bias for Spetzler-Ponce class A and B bAVMs. Risk factors for adverse outcomes from surgery for these bAVMs include size, presence of deep venous drainage, and eloquent location. Preoperative embolization did not affect the risk of perioperative hemorrhage. CONCLUSIONS: Most ruptured and unruptured low- and middle-grade bAVMs (Spetzler-Ponce A and B) can be surgically treated with a low risk of permanent morbidity and a high likelihood of preventing future hemorrhage. Our results do not apply to Spetzler-Ponce class C bAVMs.
Abstract:
We analyse the risk-taking behaviour of banks in the context of spatial competition. Banks mobilise unsecured deposits by offering deposit rates, which they invest either in a prudent or in a gambling asset. Limited liability, along with the high return of a successful gamble, induces moral hazard at the bank level. We show that when market power is low, banks invest in the gambling asset. On the other hand, for sufficiently high levels of market power, all banks choose to invest in the prudent asset. We further show that a merger of two neighbouring banks increases the likelihood of prudent behaviour. Finally, the introduction of a deposit insurance scheme exacerbates banks' moral hazard problem.
Abstract:
BACKGROUND: Children with atopic diseases in early life are frequently found to have positive IgE tests to peanuts/tree nuts without a history of previous ingestion. We aimed to identify risk factors for reactions to nuts at first introduction. METHODS: A retrospective case-note and database analysis was performed. Recruitment criteria were: patients aged 3-16 yr who had a standardized food challenge to peanut and/or tree nuts due to sensitisation to the peanut/tree nut (positive spIgE or SPT) without previous consumption. A detailed assessment of factors relating to food challenge outcome was performed with univariate and multivariate logistic regression analysis. RESULTS: There were 98 food challenges (47 peanut, 51 tree nut) with 29 positive, 67 negative and 2 inconclusive outcomes. A positive maternal history of allergy and a specific IgE >5 kU/l were strongly associated with a significantly increased risk of a positive food challenge (OR 3.73; 95% CI 1.31-10.59; p = 0.013 and OR 3.35; 95% CI 1.23-9.11; p = 0.007, respectively). Adjusting for age, a three-year-old meeting these criteria has a 67% probability of a positive challenge. There was no significant association between type of peanut/tree nut, other food allergies, atopic conditions or severity of previous food reactions and positive challenges. CONCLUSIONS: We have demonstrated that the presence of a maternal atopic history and a specific IgE >5 kU/l are associated with a significantly increased likelihood of a positive food challenge. Although requiring further prospective validation, these easily identifiable factors should be considered when deciding on the need for a challenge.
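The 67% figure comes from the fitted logistic model; the sketch below shows how a logistic model's coefficients map to such a predicted probability. The slopes are set near ln(3.73) ≈ 1.3 and ln(3.35) ≈ 1.2, consistent with the reported odds ratios, while the intercept and age coefficient are placeholders rather than the study's estimates.

```python
# Sketch: converting a logistic-regression linear predictor into a predicted
# probability of a positive food challenge. Coefficients are hypothetical.
import math

intercept = -2.0    # hypothetical intercept (log-odds scale)
b_maternal = 1.3    # ~ln(3.73): log-odds for positive maternal history
b_ige_gt5 = 1.2     # ~ln(3.35): log-odds for specific IgE > 5 kU/l
b_age = -0.1        # hypothetical per-year effect of age

def prob_positive(maternal: int, ige_gt5: int, age_years: float) -> float:
    """Predicted probability of a positive challenge from the logistic model."""
    log_odds = intercept + b_maternal * maternal + b_ige_gt5 * ige_gt5 + b_age * age_years
    return 1.0 / (1.0 + math.exp(-log_odds))

print(f"{prob_positive(maternal=1, ige_gt5=1, age_years=3):.0%}")  # 55% with these placeholders
```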
Abstract:
This paper develops a theoretical model of the demand for alcohol in which intensity and frequency of consumption are separate choices made by individuals in order to maximize their utility. While distinguishing between intensity and frequency of consumption may be unimportant for many goods, this is clearly not the case with alcohol, where the likelihood of harm depends not only on the total amount consumed but also on the pattern of use. The results from the theoretical model are applied to data from rural Australia in order to investigate the factors that affect patterns of alcohol use in this population group. This research can play an important role in informing policy by identifying the factors that influence preferences for patterns of risky alcohol use and the groups and communities that are most at risk of harm.
Abstract:
This paper contributes to the ongoing empirical debate regarding the role of the RBC model, and in particular of technology shocks, in explaining aggregate fluctuations. To this end, we estimate the model's posterior density using Markov-Chain Monte-Carlo (MCMC) methods. Within this framework we extend Ireland's (2001, 2004) hybrid estimation approach to allow for a vector autoregressive moving average (VARMA) process to describe the movements and co-movements of the model's errors not explained by the basic RBC model. The results of marginal likelihood ratio tests reveal that the more general model of the errors significantly improves the model's fit relative to the VAR and AR alternatives. Moreover, despite setting the RBC model a more difficult task under the VARMA specification, our analysis, based on forecast error and spectral decompositions, suggests that the RBC model is still capable of explaining a significant fraction of the observed variation in macroeconomic aggregates in the post-war U.S. economy.
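The model comparison reported here rests on marginal likelihood ratios computed within the Bayesian estimation; as a loose, frequentist analogue, the sketch below compares AR(1) and ARMA(1,1) descriptions of an error series by maximized log-likelihood on synthetic data. statsmodels is assumed; this is not the paper's procedure.

```python
# Sketch: comparing an AR(1) and an ARMA(1,1) description of an error series by
# maximized log-likelihood (a rough, frequentist analogue of the paper's
# Bayesian marginal-likelihood comparison). The series is synthetic ARMA(1,1) noise.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
n = 400
shocks = rng.standard_normal(n)
errors = np.zeros(n)
for t in range(1, n):
    # synthetic ARMA(1,1): e_t = 0.7 e_{t-1} + u_t + 0.4 u_{t-1}
    errors[t] = 0.7 * errors[t - 1] + shocks[t] + 0.4 * shocks[t - 1]

ar_fit = ARIMA(errors, order=(1, 0, 0)).fit()    # AR(1) error model
arma_fit = ARIMA(errors, order=(1, 0, 1)).fit()  # ARMA(1,1) error model

print(f"AR(1) loglik = {ar_fit.llf:.1f}, ARMA(1,1) loglik = {arma_fit.llf:.1f}")
# The richer ARMA error model should fit the synthetic series noticeably better.
```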
Abstract:
The level of information provided by ink evidence to the criminal and civil justice system is limited. The limitations arise from the weakness of the interpretative framework currently used, as proposed in the ASTM 1422-05 and 1789-04 standards on ink analysis. It is proposed to use the likelihood ratio from Bayes' theorem to interpret ink evidence. Unfortunately, when considering the analytical practices defined in the ASTM standards on ink analysis, it appears that current ink analytical practices do not allow for the level of reproducibility and accuracy required by a probabilistic framework. Such a framework relies on the evaluation of the statistics of ink characteristics using an ink reference database and on the objective measurement of similarities between ink samples. A complete research programme was designed to (a) develop a standard methodology for analysing ink samples in a more reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in a forensic context. This report focuses on the first of the three stages. A calibration process, based on a standard dye ladder, is proposed to improve the reproducibility of ink analysis by HPTLC when inks are analysed at different times and/or by different examiners. The impact of this process on the variability between repetitive analyses of ink samples under various conditions is studied. The results show significant improvements in the reproducibility of ink analysis compared to traditional calibration methods.
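The interpretative framework referred to here, the likelihood ratio from Bayes' theorem, can be sketched as follows; the probabilities are purely illustrative and are not drawn from any real ink database.

```python
# Sketch: likelihood-ratio (LR) interpretation of ink evidence under Bayes' theorem.
# Hp: the questioned entry was written with the suspect pen's ink.
# Hd: it was written with some other ink from the relevant population.
# All probabilities below are illustrative, not values from a real database.

p_evidence_given_hp = 0.80   # P(observed ink characteristics | Hp)
p_evidence_given_hd = 0.02   # P(observed ink characteristics | Hd), from an ink reference database
prior_odds = 1.0 / 100.0     # prior odds of Hp vs Hd (a matter for the court, not the scientist)

likelihood_ratio = p_evidence_given_hp / p_evidence_given_hd
posterior_odds = likelihood_ratio * prior_odds

print(f"LR = {likelihood_ratio:.0f}, posterior odds = {posterior_odds:.2f}")
# LR = 40, posterior odds = 0.40
```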
Abstract:
This work focuses on the development of a methodology for using the chemical characteristics of tire traces to help answer the following question: "Is the offending tire at the origin of the trace found at the crime scene?". The methodology spans from sampling the trace on the road to the statistical analysis of its chemical characteristics. Knowledge about the composition and manufacture of tire treads, together with a review of the instrumental techniques used for the analysis of polymeric materials, was used to select pyrolysis coupled to gas chromatography with mass spectrometric detection (Py-GC/MS) as the analytical technique for this research. An analytical method was developed and optimized to obtain the lowest variability between replicates of the same sample. The within-tread variability was evaluated across the width and circumference of the tread using several samples taken from twelve tires of different brands and/or models. The variability within each tread (within-variability) and between treads (between-variability) could thus be quantified. Different statistical methods showed that the within-variability is lower than the between-variability, which made it possible to differentiate these tires. Ten tire traces were produced by braking tests with tires of different brands and/or models. These traces were adequately sampled using sheets of gelatine. Particles of each trace were analysed using the same methodology as for the tires at their origin. The general chemical profile of a trace or of a tire was characterized by eighty-six compounds. Based on a statistical comparison of the chemical profiles obtained, it was shown that a tire trace is not differentiable from the tire at its origin but is generally differentiable from tires that are not at its origin. Thereafter, a sample of sixty tires was analysed to assess the discrimination potential of the developed methodology. The statistical results showed that tires of different brands and models are, for the most part, differentiable, so the developed methodology offers good discrimination potential. However, tires of the same brand and model with identical characteristics, such as country of manufacture, size and DOT number, are not differentiable. A model based on a likelihood ratio approach was chosen to evaluate the results of the comparisons between the chemical profiles of the traces and of the tires. The developed methodology was finally blind-tested using three simulated scenarios. Each scenario involved a trace from an unknown tire as well as two tires possibly at its origin. The correct results obtained for the three scenarios validated the developed methodology. The different steps of this work gathered the information required to test and validate the underlying assumption that it is possible to help determine whether an offending tire is or is not at the origin of a trace by means of a statistical comparison of their chemical profiles. This aid was formalized as a measure of the probative value of the evidence, which is represented by the chemical profile of the tire trace.
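The within- versus between-tread comparison described above is essentially a variance decomposition of replicated chemical profiles; a minimal sketch on synthetic data for a single compound (the actual study used eighty-six Py-GC/MS compounds and dedicated statistical methods):

```python
# Sketch: comparing within-tread and between-tread variability for one
# (synthetic) compound measured on replicate samples from several tires.
import numpy as np

rng = np.random.default_rng(0)
n_tires, n_replicates = 12, 5
tire_means = rng.normal(loc=100.0, scale=10.0, size=n_tires)  # between-tread spread
data = tire_means[:, None] + rng.normal(scale=2.0, size=(n_tires, n_replicates))  # within-tread noise

within_var = data.var(axis=1, ddof=1).mean()  # mean variance of replicates within a tread
between_var = data.mean(axis=1).var(ddof=1)   # variance of tread means

print(f"within = {within_var:.1f}, between = {between_var:.1f}")
# Within-variability well below between-variability -> treads can be differentiated.
```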
Abstract:
This paper is motivated by the recent interest in the use of Bayesian VARs for forecasting, even in cases where the number of dependent variables is large. In such cases, factor methods have traditionally been used, but recent work using a particular prior suggests that Bayesian VAR methods can forecast better. In this paper, we consider a range of alternative priors which have been used with small VARs, discuss the issues which arise when they are used with medium and large VARs, and examine their forecast performance using a US macroeconomic data set containing 168 variables. We find that Bayesian VARs do tend to forecast better than factor methods and provide an extensive comparison of the strengths and weaknesses of the various approaches. Our empirical results show the importance of using forecast metrics based on the entire predictive density rather than on point forecasts alone.
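The last point, that forecast metrics should use the entire predictive density rather than only point forecasts, can be illustrated with a log predictive score. A minimal sketch under an assumed Gaussian predictive density, with purely hypothetical numbers:

```python
# Sketch: comparing a density-based forecast metric (log predictive score)
# with a point-forecast metric (squared error) for one variable and one period.
# A Gaussian predictive density is assumed purely for illustration.
import math

forecast_mean = 2.0   # hypothetical predictive mean (same for both models)
forecast_sd_a = 0.5   # model A: confident (tight) predictive density
forecast_sd_b = 2.0   # model B: diffuse predictive density
realized = 3.5        # hypothetical realized value

def log_score(y: float, mean: float, sd: float) -> float:
    """Log of the Gaussian predictive density evaluated at the realization."""
    return -0.5 * math.log(2.0 * math.pi * sd**2) - (y - mean) ** 2 / (2.0 * sd**2)

squared_error = (realized - forecast_mean) ** 2  # identical for both models
print(squared_error, log_score(realized, forecast_mean, forecast_sd_a),
      log_score(realized, forecast_mean, forecast_sd_b))
# Same point forecast, same squared error, but the overconfident density (A)
# is penalized much more heavily by the log score than the diffuse one (B).
```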
Abstract:
Spatial heterogeneity, spatial dependence and spatial scale constitute key features of the spatial analysis of housing markets. However, the common practice of modelling spatial dependence as being generated by spatial interactions through a known spatial weights matrix is often not satisfactory. While existing estimators of spatial weights matrices are based on repeat sales or panel data, this paper extends this approach to a cross-sectional setting. Specifically, based on an a priori definition of housing submarkets and the assumption of a multifactor model, we develop maximum likelihood methodology to estimate hedonic models that facilitate the understanding of both spatial heterogeneity and spatial interactions. The methodology, based on statistical orthogonal factor analysis, is applied to the urban housing market of Aveiro, Portugal, at two different spatial scales.
Abstract:
We propose a non-equidistant Q rate matrix formula and an adaptive numerical algorithm for a continuous time Markov chain to approximate jump-diffusions with affine or non-affine functional specifications. Our approach also accommodates state-dependent jump intensity and jump distribution, a flexibility that is very hard to achieve with other numerical methods. The Kolmogorov-Smirnov test shows that the proposed Markov chain transition density converges to the one given by the likelihood expansion formula as in Aït-Sahalia (2008). We provide numerical examples for European stock option pricing in Black and Scholes (1973), Merton (1976) and Kou (2002).
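Abstracting from jumps, the kind of generator construction involved can be sketched as follows: a standard upwind-style Q matrix for a pure diffusion on a non-equidistant grid, shown here for illustration only and not claimed to be the paper's specific formula.

```python
# Sketch: a CTMC generator matrix approximating a pure diffusion
# dX = mu(x) dt + sigma(x) dW on a non-equidistant grid. This is a standard
# upwind-style construction that matches the local drift exactly and the local
# variance up to O(h); it is not claimed to be the paper's specific formula,
# and the jump part of a jump-diffusion is omitted.
import numpy as np

def diffusion_generator(grid, mu, sigma):
    n = len(grid)
    Q = np.zeros((n, n))
    for i in range(1, n - 1):
        h_minus = grid[i] - grid[i - 1]
        h_plus = grid[i + 1] - grid[i]
        drift, var = mu(grid[i]), sigma(grid[i]) ** 2
        up = max(drift, 0.0) / h_plus + var / (h_plus * (h_plus + h_minus))
        down = max(-drift, 0.0) / h_minus + var / (h_minus * (h_plus + h_minus))
        Q[i, i + 1], Q[i, i - 1] = up, down
        Q[i, i] = -(up + down)  # rows sum to zero; boundary states kept absorbing here
    return Q

# Example: constant-coefficient log-price dynamics on a grid that is denser near 0.
grid = np.sinh(np.linspace(-2.0, 2.0, 61))  # non-equidistant grid
Q = diffusion_generator(grid, mu=lambda x: 0.02, sigma=lambda x: 0.2)
print(Q.shape, np.allclose(Q[1:-1].sum(axis=1), 0.0))  # (61, 61) True
```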
Abstract:
The paper explores the macroeconomic consequences of fiscal consolidations whose timing and composition are uncertain. Drawing on the evidence in Alesina and Ardagna (2010), we emphasize whether the fiscal consolidation is driven by tax rises or by expenditure cuts. We find that the composition of the fiscal consolidation, its duration, the monetary policy stance, the level of government debt and expectations over the likelihood and composition of fiscal consolidations all matter in determining the extent to which a given consolidation is expansionary and/or successful in stabilizing government debt.