299 results for Trimmed likelihood
Abstract:
The research reported in this series of articles aimed at (1) automating the search for questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and compared in an objective and automated way. The latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is based entirely on expert opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach for the search of ink specimens in ink databases and for the interpretation of their evidential value.
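As a minimal sketch of the likelihood-ratio idea behind such a probabilistic model: if similarity scores between HPTLC ink profiles were modeled with one distribution for same-source pairs and another for different-source pairs, the evidential value of an observed score is the ratio of the two densities. The Gaussians and parameters below are illustrative assumptions, not the paper's fitted model.

```python
from scipy.stats import norm

# Illustrative score models, NOT the paper's fitted distributions:
# similarity between HPTLC profiles of same-source and different-source
# ink pairs, each modeled here as a univariate Gaussian.
same_source = norm(loc=0.85, scale=0.05)
diff_source = norm(loc=0.55, scale=0.12)

def likelihood_ratio(score):
    """LR = P(score | same source) / P(score | different source)."""
    return same_source.pdf(score) / diff_source.pdf(score)

# LR >> 1 supports a common source; LR << 1 supports different sources.
print(f"LR at score 0.80: {likelihood_ratio(0.80):.1f}")
print(f"LR at score 0.50: {likelihood_ratio(0.50):.2e}")
```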
Abstract:
False identity documents constitute a potentially powerful source of forensic intelligence because they are essential elements of transnational crime and provide cover for organized crime. In previous work, a systematic profiling method using the visual features of false documents was built within a forensic intelligence model. In the current study, the comparison process and metrics lying at the heart of this profiling method are described and evaluated. This evaluation takes advantage of 347 false identity documents of four different types seized in two countries, whose sources were known to be common or different (following police investigations and the dismantling of counterfeit factories). Intra-source and inter-source variation was evaluated through the computation of more than 7500 similarity scores. The profiling method could thus be validated and its performance assessed using two complementary approaches to measuring type I and type II error rates: binary classification and the computation of likelihood ratios. Very low error rates were measured across the four document types, demonstrating the validity and robustness of the method for linking documents to a common source or for differentiating them. These results pave the way for the operational implementation of a systematic profiling process integrated into the previously developed forensic intelligence model.
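A minimal sketch of the two complementary evaluation approaches, under the assumption that intra-source scores concentrate high and inter-source scores low; the simulated scores below are illustrative stand-ins for the 7500+ scores computed in the study.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Illustrative similarity scores: pairs of documents from the same
# counterfeit factory (intra-source) vs different sources (inter-source).
intra = rng.normal(0.90, 0.04, 3000)
inter = rng.normal(0.60, 0.10, 4500)

# Approach 1: binary classification at a fixed threshold.
threshold = 0.78
type_I = np.mean(inter >= threshold)   # falsely linked pairs
type_II = np.mean(intra < threshold)   # missed links
print(f"type I: {type_I:.3%}, type II: {type_II:.3%}")

# Approach 2: likelihood ratio for an observed score, from Gaussians
# fitted to each set of scores.
f_intra = norm(*norm.fit(intra))
f_inter = norm(*norm.fit(inter))
score = 0.85
print(f"LR({score}) = {f_intra.pdf(score) / f_inter.pdf(score):.1f}")
```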
Abstract:
BACKGROUND: Poor long-term adherence is an important cause of uncontrolled hypertension. We examined whether monitoring drug adherence with an electronic system improves long-term blood pressure (BP) control in hypertensive patients followed by general practitioners (GPs). METHODS: A pragmatic cluster-randomised controlled study was conducted over one year in community pharmacist/GP networks randomly assigned either to usual care (UC), where drugs were dispensed as usual, or to an intervention (INT) group, where drug adherence could be monitored with an electronic system (Medication Event Monitoring System). No therapy change was allowed during the first 2 months in either group. Thereafter, GPs could modify therapy and use the electronic monitors freely in the INT group. The primary outcome was a target office BP < 140/90 mmHg. RESULTS: Sixty-eight treated, uncontrolled hypertensive patients (UC: 34; INT: 34) were enrolled. Over the 12-month period, the likelihood of reaching the target BP was higher in the INT group than in the UC group (p<0.05). At 4 months, 38% of the INT group reached the target BP vs. 12% of the UC group (p<0.05), and 21% vs. 9% at 12 months (p: ns). Multivariate analyses taking account of baseline characteristics, therapy modification during follow-up, and clustering effects by network indicate that allocation to the INT group was associated with greater odds of reaching the target BP at 4 months (p<0.01) and at 12 months (p=0.051). CONCLUSION: GPs monitoring drug adherence in collaboration with pharmacists achieved better BP control in hypertensive patients, although the impact of monitoring decreased over time.
Abstract:
A molecular phylogeny of soricid shrews (Soricidae, Eulipotyphla, Mammalia) based on 1140-bp mitochondrial cytochrome b gene (cytb) sequences was inferred by the maximum-likelihood (ML) method. All 13 genera of extant Soricinae and two genera of Crocidurinae were included in the analyses. Anourosorex was phylogenetically distant from the main groupings within Soricinae and Crocidurinae in the ML tree; thus, it could not be determined to which subfamily Anourosorex should be assigned: Soricinae, Crocidurinae or a new subfamily. Soricinae (excluding Anourosorex) should be divided into four tribes: Neomyini, Notiosoricini, Soricini and Blarinini. However, the monophyly of Blarinini was not robust in the present data set. Branching orders among the tribes of Soricinae and among the genera of Neomyini also could not be determined because of insufficient phylogenetic information in the cytb sequences. For the water shrews of Neomyini (Chimarrogale, Nectogale and Neomys), the monophyly of Neomys and the Chimarrogale-Nectogale group could not be verified, which implies the possibility of multiple origins for the semi-aquatic mode of living among taxa within Neomyini. Episoriculus may comprise several separate genera. Blarinella was included in Blarinini, not Soricini, based on the cytb sequences, but the confidence level was rather low; hence more phylogenetic information is needed to determine its position. Furthermore, some specific problems in the taxonomy of soricid shrews were clarified, for example the phylogeny of local populations of Notiosorex crawfordi, Chimarrogale himalayica and Crocidura attenuata.
Abstract:
Predictive groundwater modeling requires accurate information about aquifer characteristics. Geophysical imaging is a powerful tool for delineating aquifer properties at an appropriate scale and resolution, but it suffers from problems of ambiguity. One way to overcome such limitations is to adopt a simultaneous multitechnique inversion strategy. We have developed a methodology for aquifer characterization based on structural joint inversion of multiple geophysical data sets, followed by clustering to form zones and subsequent inversion for zonal parameters. Joint inversions based on cross-gradient structural constraints require less restrictive assumptions than, say, applying predefined petrophysical relationships, and generally yield superior results. This approach has, for the first time, been applied to three geophysical data types in three dimensions. A classification scheme using maximum-likelihood estimation determines the parameters of a Gaussian mixture model that defines zonal geometries from the joint-inversion tomograms. The resulting zones are used to estimate representative geophysical parameters for each zone, which are then used for field-scale petrophysical analysis. A synthetic study demonstrated how joint inversion of seismic and radar traveltimes and electrical resistance tomography (ERT) data greatly reduces the misclassification of zones (down from 21.3% to 3.7%) and the error in the retrieved zonal parameters (from 1.8% to 0.3%) compared with individual inversions. We applied our scheme to a data set collected in northeastern Switzerland to delineate lithologic subunits within a gravel aquifer. The inversion models resolve three principal subhorizontal units along with some important 3D heterogeneity. Petrophysical analysis of the zonal parameters indicated approximately 30% variation in porosity within the gravel aquifer and an increasing fraction of finer sediments with depth.
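A minimal sketch of the zonation step, assuming co-located parameter values extracted from three jointly inverted tomograms; the cell values below are synthetic, and scikit-learn's GaussianMixture fits the mixture by maximum likelihood via EM.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Synthetic stand-ins for co-located tomogram values: each row is one
# model cell with (seismic velocity m/s, radar velocity m/ns,
# log10 resistivity ohm-m), drawn from two lithologic zones.
zone_a = rng.normal([1800, 0.09, 2.0], [60, 0.004, 0.10], (500, 3))
zone_b = rng.normal([2200, 0.07, 2.6], [80, 0.005, 0.10], (500, 3))
cells = np.vstack([zone_a, zone_b])

# Maximum-likelihood (EM) fit of a Gaussian mixture whose components
# define the zonal geometries; each cell is assigned to its most likely
# zone, and the component means give representative zonal parameters
# for petrophysical analysis.
gmm = GaussianMixture(n_components=2, random_state=0).fit(cells)
zones = gmm.predict(cells)
print("cells per zone:", np.bincount(zones))
print("zonal parameter means:\n", gmm.means_)
```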
Abstract:
BACKGROUND: The Marburg Heart Score (MHS) aims to assist GPs in safely ruling out coronary heart disease (CHD) in patients presenting with chest pain and to guide management decisions. AIM: To investigate the diagnostic accuracy of the MHS in an independent sample and to evaluate its generalisability to new patients. DESIGN AND SETTING: Cross-sectional diagnostic study with a delayed-type reference standard in general practice in Hesse, Germany. METHOD: Fifty-six German GPs recruited 844 males and females aged ≥35 years presenting with chest pain between July 2009 and February 2010. Baseline data included the items of the MHS. Data on the subsequent course of chest pain, investigations, hospitalisations, and medication were collected over 6 months and reviewed by an independent expert panel. CHD was the reference condition. Measures of diagnostic accuracy included the area under the receiver operating characteristic curve (AUC), sensitivity, specificity, likelihood ratios, and predictive values. RESULTS: The AUC was 0.84 (95% confidence interval [CI] = 0.80 to 0.88). For a cut-off value of 3, the MHS showed a sensitivity of 89.1% (95% CI = 81.1% to 94.0%), a specificity of 63.5% (95% CI = 60.0% to 66.9%), a positive predictive value of 23.3% (95% CI = 19.2% to 28.0%), and a negative predictive value of 97.9% (95% CI = 96.2% to 98.9%). CONCLUSION: Considering the diagnostic accuracy of the MHS, its generalisability, and its ease of application, its use in clinical practice is recommended.
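All of the reported accuracy measures derive from a 2x2 cross-tabulation of the dichotomized score against the reference condition. A minimal sketch with hypothetical counts (not the study's data):

```python
# Hypothetical 2x2 counts, NOT the study's data: patients at or above
# the MHS cut-off of 3 (test positive) vs below, by CHD status.
tp, fn = 57, 7      # CHD present: test positive / negative
fp, tn = 188, 327   # CHD absent:  test positive / negative

sens = tp / (tp + fn)
spec = tn / (tn + fp)
ppv = tp / (tp + fp)                # positive predictive value
npv = tn / (tn + fn)                # negative predictive value
lr_pos = sens / (1 - spec)          # positive likelihood ratio
lr_neg = (1 - sens) / spec          # negative likelihood ratio
print(f"sens {sens:.1%}, spec {spec:.1%}, PPV {ppv:.1%}, NPV {npv:.1%}")
print(f"LR+ {lr_pos:.2f}, LR- {lr_neg:.2f}")
```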
Abstract:
BACKGROUND: Legionella species cause severe forms of pneumonia with high mortality and complication rates. Accurate clinical predictors for assessing the likelihood of Legionella community-acquired pneumonia (CAP) in patients presenting to the emergency department are lacking. METHODS: We retrospectively compared clinical and laboratory data of 82 consecutive patients with Legionella CAP and 368 consecutive patients with non-Legionella CAP included in two studies at the same institution. RESULTS: In multivariate logistic regression analysis we identified six parameters as independent predictors of Legionella CAP: high body temperature (OR 1.67, p < 0.0001), absence of sputum production (OR 3.67, p < 0.0001), low serum sodium concentration (OR 0.89, p = 0.011), high lactate dehydrogenase (OR 1.003, p = 0.007), high C-reactive protein (OR 1.006, p < 0.0001) and low platelet count (OR 0.991, p < 0.0001). Using optimal cut-off values for these six parameters, we calculated a diagnostic score for Legionella CAP. The median score was significantly higher in Legionella CAP than in patients without Legionella (4 [IQR 3-4] vs 2 [IQR 1-2], p < 0.0001), with a corresponding odds ratio of 3.34 (95% CI 2.57-4.33, p < 0.0001). Receiver operating characteristic analysis showed a high diagnostic accuracy for this score (AUC 0.86, 95% CI 0.81-0.90), better than that of each parameter alone. Of the 191 patients (42%) with a score of 0 or 1 point, only 3% had Legionella pneumonia. Conversely, of the 73 patients (16%) with ≥4 points, 66% had Legionella CAP. CONCLUSION: Six clinical and laboratory parameters embedded in a simple diagnostic score accurately identified patients with Legionella CAP. If validated in future studies, this score might aid in the management of suspected Legionella CAP.
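A sketch of how such a point score works: each parameter is dichotomized at a cut-off and contributes one point. The cut-off values below are invented placeholders (the abstract does not report the derived optimal cut-offs), so the function only illustrates the mechanics:

```python
# Cut-off values are invented placeholders; the abstract does not
# report the study's derived optimal cut-offs.
def legionella_score(temp_c, sputum, sodium, ldh, crp, platelets):
    """Sum of six dichotomized predictors (0-6 points)."""
    return sum([
        temp_c > 39.0,       # high body temperature
        not sputum,          # absence of sputum production
        sodium < 133,        # low serum sodium concentration
        ldh > 225,           # high lactate dehydrogenase (U/L)
        crp > 180,           # high C-reactive protein (mg/L)
        platelets < 170,     # low platelet count (G/L)
    ])

# Two hypothetical presentations:
print(legionella_score(40.1, False, 130, 310, 250, 150))  # 6: Legionella likely
print(legionella_score(38.3, True, 139, 180, 90, 260))    # 0: unlikely
```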
Abstract:
Robust estimators for accelerated failure time models with asymmetric (or symmetric) error distributions and censored observations are proposed. It is assumed that the error model belongs to a log-location-scale family of distributions and that the mean response is the parameter of interest. Since scale is a main component of the mean, it is not treated as a nuisance parameter. A three-step procedure is proposed. In the first step, an initial high-breakdown-point S-estimate is computed. In the second step, observations that are unlikely under the estimated model are rejected or downweighted. Finally, a weighted maximum likelihood estimate is computed. To define the estimates, functions of censored residuals are replaced by their estimated conditional expectation given that the response is larger than the observed censored value. The rejection rule in the second step is based on an adaptive cut-off that, asymptotically, does not reject any observation when the data are generated according to the model. Therefore, the final estimate attains full efficiency at the model, with respect to the maximum likelihood estimate, while maintaining the breakdown point of the initial estimator. Asymptotic results are provided. The new procedure is evaluated with the help of Monte Carlo simulations. Two examples with real data are discussed.
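A much-simplified sketch of the three-step logic for an uncensored log-normal sample. The paper's actual procedure starts from an S-estimate, uses an adaptive cut-off, and replaces censored residuals by conditional expectations; those are replaced here by simpler stand-ins (median/MAD initial fit, fixed cut-off, no censoring).

```python
import numpy as np

def trimmed_ml_lognormal(t, cutoff=2.5):
    """Three-step robust estimate of (mu, sigma) for log-normal
    lifetimes t, ignoring censoring."""
    y = np.log(t)
    # Step 1: robust initial location/scale (stand-in for a
    # high-breakdown-point S-estimate).
    mu0 = np.median(y)
    s0 = 1.4826 * np.median(np.abs(y - mu0))  # MAD, consistent at the normal
    # Step 2: reject observations unlikely under the initial fit
    # (a fixed cut-off stands in for the paper's adaptive rule).
    keep = np.abs(y - mu0) / s0 <= cutoff
    # Step 3: maximum likelihood on the retained observations; for the
    # normal log-lifetime model this is the sample mean and SD.
    return y[keep].mean(), y[keep].std(ddof=1), keep

rng = np.random.default_rng(2)
t = np.exp(rng.normal(1.0, 0.5, 200))
t[:10] *= 50.0   # contaminate with gross outliers
mu, sigma, keep = trimmed_ml_lognormal(t)
print(f"mu = {mu:.2f}, sigma = {sigma:.2f}, rejected = {int((~keep).sum())}")
```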
Abstract:
PURPOSE: The objective of this study was to investigate the effects of weather, ranking, and home advantage on international football match results and scores in the Gulf Cooperation Council (GCC) region. METHODS: Football matches (n = 2008) in six GCC countries were analyzed. To determine the influence of weather on the likelihood of a favorable outcome and on goal difference, a generalized linear model with a logit link function and a multiple regression analysis were performed. RESULTS: In the GCC region, home teams tend to have a greater likelihood of a favorable outcome (P < 0.001) and a higher goal difference (P < 0.001). Temperature difference was identified as a significant explanatory variable when used independently (P < 0.001) or after adjustment for home advantage and team ranking (P < 0.001). The likelihood of a favorable outcome for GCC teams increases by 3% for every 1-unit increase in temperature difference. After inclusion of an interaction with opposition, this advantage remains significant only when playing against non-GCC opponents. While home advantage increased the odds of a favorable outcome (P < 0.001) and the goal difference (P < 0.001) after inclusion of the interaction term, the likelihood of a favorable outcome for a GCC team decreased (P < 0.001) when playing against a stronger opponent. Finally, temperature and the wet bulb globe temperature approximation were found to be better indicators of the effect of environmental conditions on match outcomes than absolute humidity, relative humidity, or the heat index. CONCLUSIONS: In the GCC region, higher temperature increased the likelihood of a favorable outcome when playing against non-GCC teams. However, international ranking should be considered, because an opponent with a higher rank reduced, but did not eliminate, the likelihood of a favorable outcome.
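A minimal sketch of the modeling step, using simulated match-level data (the variable names and effect sizes are assumptions for illustration, not the study's data) and statsmodels' GLM with a binomial family and logit link. Exponentiating a coefficient gives the per-unit change in the odds of a favorable outcome, which is how the reported "3% per 1-unit increase in temperature difference" reads.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2000

# Simulated match-level data (assumed for illustration):
temp_diff = rng.normal(0, 8, n)        # temperature difference (deg C)
home = rng.integers(0, 2, n)           # home advantage flag
rank_diff = rng.normal(0, 20, n)       # opponent rank minus own rank
eta = -0.2 + 0.03 * temp_diff + 0.5 * home + 0.01 * rank_diff
favorable = rng.binomial(1, 1 / (1 + np.exp(-eta)))

# Generalized linear model with a logit link, as in the study design.
X = sm.add_constant(np.column_stack([temp_diff, home, rank_diff]))
fit = sm.GLM(favorable, X, family=sm.families.Binomial()).fit()
odds_ratios = np.exp(fit.params)
print(odds_ratios)  # ~1.03 for temp_diff: about +3% odds per degree
```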
Abstract:
Selectome (http://selectome.unil.ch/) is a database of positive selection based on a branch-site likelihood test. This model estimates the numbers of nonsynonymous substitutions (dN) and synonymous substitutions (dS) to evaluate variation in selective pressure (the dN/dS ratio) over branches and over sites. Since the original release of Selectome, we have benchmarked and implemented a thorough quality-control procedure on multiple sequence alignments, aiming to minimize false-positive results. We have also improved the computational efficiency of the branch-site test implementation, allowing larger data sets and more frequent updates. Release 6 of Selectome includes all gene trees from Ensembl for Primates and Glires, as well as a large set of vertebrate gene trees. A total of 6810 gene trees show some evidence of positive selection. Finally, the web interface has been improved to be more responsive and to facilitate searching and browsing.
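The branch-site test reduces to a likelihood ratio test between a null model (dN/dS on the foreground branch capped at 1) and an alternative allowing dN/dS > 1. A minimal sketch of the significance calculation with made-up log-likelihoods (not Selectome output); the exact null distribution is a 50:50 mixture of a point mass at zero and a chi-square with 1 df, so using chi-square(1) alone is conservative.

```python
from scipy.stats import chi2

# Made-up fitted log-likelihoods, not Selectome output:
lnL_null = -4521.7   # branch-site model, foreground dN/dS <= 1
lnL_alt = -4515.3    # alternative, foreground dN/dS free to exceed 1

lrt = 2.0 * (lnL_alt - lnL_null)   # likelihood ratio test statistic
p = chi2.sf(lrt, df=1)             # conservative chi-square(1) p-value
print(f"2*deltaLnL = {lrt:.2f}, p = {p:.2e}")
```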
Abstract:
Background: The Pulmonary Embolism Rule-out Criteria (PERC) rule is a clinical diagnostic rule designed to exclude pulmonary embolism (PE) without further testing. We sought to externally validate the diagnostic performance of the PERC rule alone and combined with clinical probability assessment based on the revised Geneva score. Methods: The PERC rule was applied retrospectively to consecutive patients who presented to six emergency departments with a clinical suspicion of PE and who were enrolled in a randomized trial of PE diagnosis. Patients who met all eight PERC criteria [PERC(-)] were considered to be at very low risk for PE. We calculated the prevalence of PE among PERC(-) patients according to their clinical pretest probability of PE, and estimated the negative likelihood ratio of the PERC rule for predicting PE. Results: Among 1675 patients, the prevalence of PE was 21.3%. Overall, 13.2% of patients were PERC(-). The prevalence of PE was 5.4% [95% confidence interval (CI): 3.1-9.3%] among PERC(-) patients overall and 6.4% (95% CI: 3.7-10.8%) among PERC(-) patients with a low clinical pretest probability of PE. The PERC rule had a negative likelihood ratio of 0.70 (95% CI: 0.67-0.73) for predicting PE overall, and 0.63 (95% CI: 0.38-1.06) in low-risk patients. Conclusions: Our results suggest that the PERC rule, alone or even combined with the revised Geneva score, cannot safely identify very-low-risk patients in whom PE can be ruled out without additional testing, at least in populations with a relatively high prevalence of PE.
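The clinical meaning of the reported negative likelihood ratio can be made concrete with Bayes' theorem in odds form. A short worked example using the abstract's own figures (21.3% overall prevalence, negative likelihood ratio of 0.70):

```python
def posttest_prob(pretest, lr):
    """Bayes' theorem in odds form: posttest odds = pretest odds x LR."""
    odds = pretest / (1.0 - pretest) * lr
    return odds / (1.0 + odds)

# Figures reported in the abstract: overall PE prevalence 21.3% and a
# negative likelihood ratio of 0.70 for the PERC rule.
print(f"{posttest_prob(0.213, 0.70):.1%}")  # ~15.9%: too high to rule out PE
```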
Abstract:
We examined phylogenetic relationships among six dormouse species representing three subfamilies (Glirinae, Graphiurinae and Leithiinae) using sequences from three nuclear protein-coding genes (apolipoprotein B, APOB; interphotoreceptor retinoid-binding protein, IRBP; recombination-activating gene 1, RAG1). Phylogenetic trees reconstructed by maximum-parsimony (MP), maximum-likelihood (ML) and Bayesian-inference (BI) analyses showed the monophyly of Glirinae (Glis and Glirulus) and Leithiinae (Dryomys, Eliomys and Muscardinus) with strong support, although the branch maintaining this relationship was very short, implying rapid diversification among the three subfamilies. Divergence times were estimated by ML (local clock model) and a Bayesian dating method using calibration points of 25 Myr (million years) ago for the divergence between Glis and Glirulus and 55 Myr ago for the split between the Gliridae and Sciuridae lineages, on the basis of fossil records. The results showed that each lineage of Graphiurus, Glis, Glirulus and Muscardinus dates from the Late Oligocene to the Early Miocene, which is mostly in agreement with the fossil record. Given that a warm climate harbouring glirid-favoured forest dominated from Europe to Asia during this period, this warm environment is considered to have triggered the prosperity of glirid species through rapid diversification. Glirulus japonicus is suggested to be a relict of this ancient diversification during the warm period.
Abstract:
Research projects aimed at proposing fingerprint statistical models based on the likelihood ratio framework have shown that low-quality finger impressions left at crime scenes may have significant evidential value. These impressions are currently either not recovered, considered to be of no value when first analyzed by fingerprint examiners, or found inconclusive when compared to control prints. There are growing concerns within the fingerprint community that recovering and examining these low-quality impressions will significantly increase the workload of fingerprint units and, ultimately, the number of backlogged cases. This study was designed to measure the number of impressions currently not recovered or not considered for examination, and to assess the usefulness of these impressions in terms of the number of additional detections that would result from their examination.
Abstract:
Cardiovascular risk assessment might be improved by the addition of emerging tests derived from atherosclerosis imaging, laboratory testing or functional testing. This article reviews relative risk, odds ratios, receiver operating characteristic curves, posttest risk calculation based on likelihood ratios, the net reclassification improvement and the integrated discrimination improvement. These tools serve to determine whether a new test adds clinical value on top of conventional risk testing and how this can be verified statistically. Two clinically meaningful examples illustrate the novel approaches. This work serves as a review and as groundwork for new guidelines on cardiovascular risk prediction taking emerging tests into account, to be proposed in the future by members of the 'Taskforce on Vascular Risk Prediction' under the auspices of the Working Group 'Swiss Atherosclerosis' of the Swiss Society of Cardiology.
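As a sketch of one of the reviewed measures, the net reclassification improvement compares risk categories assigned before and after adding a new test: upward reclassification should concentrate in patients with events and downward reclassification in those without. The categories and outcomes below are toy data, not from any study.

```python
import numpy as np

def nri(old_cat, new_cat, event):
    """Net reclassification improvement over ordered risk categories."""
    old_cat, new_cat, event = map(np.asarray, (old_cat, new_cat, event))
    up, down = new_cat > old_cat, new_cat < old_cat
    ev, ne = event == 1, event == 0
    nri_ev = (up & ev).sum() / ev.sum() - (down & ev).sum() / ev.sum()
    nri_ne = (down & ne).sum() / ne.sum() - (up & ne).sum() / ne.sum()
    return nri_ev + nri_ne

# Toy risk categories (0 = low, 1 = intermediate, 2 = high) before and
# after adding a hypothetical imaging test, with observed outcomes:
old = [0, 1, 1, 2, 0, 1, 2, 1]
new = [1, 2, 0, 2, 0, 0, 2, 2]
evt = [1, 1, 0, 1, 0, 0, 1, 0]
print(f"NRI = {nri(old, new, evt):+.2f}")
```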
Abstract:
BACKGROUND: The evaluation of syncope often remains unstructured. The aim of this study was to assess the effectiveness of a standardized protocol designed to improve the diagnosis of syncope. METHODS: Consecutive patients with syncope presenting to the emergency departments of two primary and tertiary care hospitals over a period of 18 months underwent a two-phase evaluation: 1) noninvasive assessment (phase I); and 2) specialized tests (phase II) if syncope remained unexplained after phase I. During phase II, the evaluation strategy was alternately left to the physicians in charge of the patients (control) or guided by a standardized protocol relying on cardiac status and the frequency of events (intervention). The primary outcomes were the diagnostic yield of each phase and the impact of the intervention (phase II) measured by multivariable analysis. RESULTS: Among 1725 patients with syncope, 1579 (92%) entered phase I, which established a diagnosis in 1061 (67%) of them, mainly reflex causes and orthostatic hypotension. Five hundred and eighteen patients (33%) were considered to have unexplained syncope, and 363 (70%) of them entered phase II. A cause of syncope was found in 67 (38%) of 174 patients during intervention periods, compared with 18 (9%) of 189 during control periods (p<0.001). Compared with control periods, the intervention permitted the diagnosis of more cardiac (8% vs 3%, p=0.04) and reflex syncope (25% vs 6%, p<0.001), and increased the odds of identifying a cause of syncope by a factor of 4.5 (95% CI: 2.6-8.7, p<0.001). Overall, adding the diagnostic yields of phase I and phase II (intervention periods) established the cause of syncope in 76% of patients. CONCLUSION: Application of a standardized diagnostic protocol in patients with syncope improved the likelihood of identifying a cause of this symptom. Future trials should assess the efficacy of diagnosis-specific therapy.