983 results for "Penalized likelihood"
Abstract:
OBJECTIVE: We aimed to create an index to stratify cryptogenic stroke (CS) patients with patent foramen ovale (PFO) by their likelihood that the stroke was related to their PFO. METHODS: Using data from 12 component studies, we used generalized linear mixed models to predict the presence of PFO among patients with CS, and derive a simple index to stratify patients with CS. We estimated the stratum-specific PFO-attributable fraction and stratum-specific stroke/TIA recurrence rates. RESULTS: Variables associated with a PFO in CS patients included younger age, the presence of a cortical stroke on neuroimaging, and the absence of these factors: diabetes, hypertension, smoking, and prior stroke or TIA. The 10-point Risk of Paradoxical Embolism score is calculated from these variables so that the youngest patients with superficial strokes and without vascular risk factors have the highest score. PFO prevalence increased from 23% (95% confidence interval [CI]: 19%-26%) in those with 0 to 3 points to 73% (95% CI: 66%-79%) in those with 9 or 10 points, corresponding to attributable fraction estimates of approximately 0% to 90%. Kaplan-Meier estimated stroke/TIA 2-year recurrence rates decreased from 20% (95% CI: 12%-28%) in the lowest Risk of Paradoxical Embolism score stratum to 2% (95% CI: 0%-4%) in the highest. CONCLUSION: Clinical characteristics identify CS patients who vary markedly in PFO prevalence, reflecting clinically important variation in the probability that a discovered PFO is likely to be stroke-related vs incidental. Patients in strata more likely to have stroke-related PFOs have lower recurrence risk.
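The reported attributable fractions follow from the stratum-specific PFO prevalences via a standard Bayes-type calculation. The sketch below is a minimal illustration of that calculation, assuming the commonly cited background PFO prevalence of roughly 25% in the general population (a figure not restated in the abstract).

```python
# Hypothetical sketch: stratum-specific PFO-attributable fraction.
# Assumes a ~25% background PFO prevalence; the component-study data
# themselves are not reproduced here.

def pfo_attributable_fraction(prevalence_in_cs: float,
                              background_prevalence: float = 0.25) -> float:
    """Fraction of discovered PFOs expected to be stroke-related rather
    than incidental, given PFO prevalence in a cryptogenic-stroke stratum."""
    p, q = prevalence_in_cs, background_prevalence
    af = (p - q) / (p * (1.0 - q))
    return max(af, 0.0)  # clip at zero when prevalence is at/below background

for stratum, prev in [("RoPE 0-3", 0.23), ("RoPE 9-10", 0.73)]:
    print(f"{stratum}: PFO prevalence {prev:.0%}, "
          f"attributable fraction ≈ {pfo_attributable_fraction(prev):.0%}")
```

With these inputs the two strata come out near 0% and just under 90%, in line with the range quoted above.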
Abstract:
Background: Network reconstructions at the cell level are a major development in Systems Biology. However, we are far from fully exploiting their potential. Often, the incremental complexity of the pursued systems overrides experimental capabilities, or increasingly sophisticated protocols are underutilized to merely refine confidence levels of already established interactions. For metabolic networks, the currently employed confidence scoring system rates reactions discretely according to nested categories of experimental evidence or model-based likelihood. Results: Here, we propose a complementary network-based scoring system that exploits the statistical regularities of a metabolic network as a bipartite graph. As an illustration, we apply it to the metabolism of Escherichia coli. The model is adjusted to the observations to derive connection probabilities between individual metabolite-reaction pairs and, after validation, to assess the reliability of each reaction in probabilistic terms. This network-based scoring system uncovers very specific reactions that could be functionally or evolutionarily important, identifies prominent experimental targets, and enables further confirmation of modeling results. Conclusions: We foresee a wide range of potential applications at different sub-cellular or supra-cellular levels of biological interactions given the natural bipartivity of many biological networks.
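As a loose illustration of what a degree-based score on a bipartite metabolite-reaction graph can look like, the toy sketch below estimates a connection probability from node degrees alone. It is a configuration-model-style null, not the scoring model actually developed in the paper, and the reactions and metabolite names are made up.

```python
# Illustrative sketch only: a degree-based (configuration-model-style)
# estimate of metabolite-reaction connection probabilities in a bipartite
# graph. The paper's actual statistical model is not reproduced here.

from collections import defaultdict

# toy bipartite network: reaction -> participating metabolites (hypothetical)
reactions = {
    "R1": {"glc", "atp", "g6p", "adp"},
    "R2": {"g6p", "f6p"},
    "R3": {"f6p", "atp", "fbp", "adp"},
}

met_degree = defaultdict(int)
for mets in reactions.values():
    for m in mets:
        met_degree[m] += 1
n_links = sum(len(mets) for mets in reactions.values())

def connection_probability(reaction: str, metabolite: str) -> float:
    """Crude null-model probability that this metabolite-reaction link exists,
    based only on the degrees of the two nodes."""
    k_r = len(reactions[reaction])
    k_m = met_degree[metabolite]
    return min(1.0, k_r * k_m / n_links)

print(connection_probability("R2", "g6p"))  # 0.4 in this toy network
```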
Abstract:
In alcohol epidemiology surveys, there is a tradition of measuring alcohol-related consequences using respondents' attribution of alcohol as the cause. The authors aimed to compare the prevalence and frequency of self-attributed consequences to consequences without self-attribution using alcohol-attributable fractions (AAF). In 2007, a total of 7,174 Swiss school students aged 13-16 years reported the numbers of 6 alcohol-related adverse consequences (e.g., fights, injuries) they had incurred in the past 12 months. Consequences were measured with and without attribution of alcohol as the cause. The alcohol-use measures were frequency and volume of drinking in the past 12 months and number of risky single-occasion (≥5 drinks) drinking episodes in the past 30 days. Attributable fractions were derived from logistic (≥1 incident) and Poisson (number of incidents) regression analyses. Although relative risk estimates were higher when alcohol-attributed consequences were compared with nonattributed consequences, the use of AAFs resulted in more alcohol-related consequences (10,422 self-attributed consequences vs. 24,520 nonattributed consequences determined by means of AAFs). The likelihood of underreporting was higher among drinkers with intermediate frequencies than among either rare drinkers or frequent drinkers. Therefore, the extent of alcohol-related adverse consequences among adolescents may be underestimated when using self-attributed consequences, because of differential attribution processes, especially among infrequent drinkers.
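The attributable-fraction idea can be illustrated with Levin's classic population formula. The sketch below is a simplified stand-in for the regression-based AAFs used in the study, with a hypothetical exposure prevalence and relative risk.

```python
# Minimal sketch of a population alcohol-attributable fraction using
# Levin's formula; the study derived AAFs from fitted logistic and Poisson
# models, which is not reproduced here. The numbers below are hypothetical.

def levin_aaf(p_exposed: float, relative_risk: float) -> float:
    """Share of consequences attributable to the exposure in the population."""
    excess = p_exposed * (relative_risk - 1.0)
    return excess / (1.0 + excess)

print(levin_aaf(p_exposed=0.30, relative_risk=2.5))  # ≈ 0.31
```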
Abstract:
The identity [r]evolution is happening. Who are you, who am I in the information society? In recent years, the convergence of several factors - technological, political, economic - has accelerated a fundamental change in our networked world. On a technological level, information becomes easier to gather, to store, to exchange and to process. The belief that more information brings more security has been a strong political driver to promote information gathering since September 11. Profiling intends to transform information into knowledge in order to anticipate one's behaviour, or needs, or preferences. It can lead to categorizations according to some specific risk criteria, for example, or to direct and personalized marketing. As a consequence, new forms of identities appear. They are not necessarily related to our names anymore. They are based on information, on traces that we leave when we act or interact, when we go somewhere or just stay in one place, or even sometimes when we make a choice. They are related to the SIM cards of our mobile phones, to our credit card numbers, to the pseudonyms that we use on the Internet, to our email addresses, to the IP addresses of our computers, to our profiles... Like traditional identities, these new forms of identities can allow us to distinguish an individual within a group of people, or describe this person as belonging to a community or a category. How far have we moved through this process? The identity [r]evolution is already becoming part of our daily lives. People are eager to share information with their "friends" in social networks like Facebook, in chat rooms, or in Second Life. Customers take advantage of the numerous bonus cards that are made available. Video surveillance is becoming the rule. In several countries, traditional ID documents are being replaced by biometric passports with RFID technologies. This raises several privacy issues and might actually even result in changing the perception of the concept of privacy itself, in particular by the younger generation. In the information society, our (partial) identities become the illusory masks that we choose - or that we are assigned - to interact and communicate with each other. Rights, obligations, responsibilities, even reputation are increasingly associated with these masks. On the one hand, these masks become the key to access restricted information and to use services. On the other hand, in case of fraud or negative reputation, the owner of such a mask can be penalized: doors remain closed, access to services is denied. Hence the current worrying growth of impersonation, identity theft and other identity-related crimes. Where is the path of the identity [r]evolution leading us? The booklet gives a glimpse of possible scenarios in the field of identity.
Abstract:
BACKGROUND: Only 25% of IVF transfer cycles lead to a clinical pregnancy, calling for continued technical progress but also more in depth analysis of patients' individual characteristics. The interleukin-1 (IL-1) system and matrix metalloproteinases (MMPs) are strongly implicated in embryo implantation. The genes coding for IL-1Ra (gene symbol IL-1RN), IL-1beta, MMP2 and MMP9 bear functional polymorphisms. We analysed the maternal genetic profile at these polymorphic sites in IVF patients, to determine possible correlations with IVF outcome. METHODS: One hundred and sixty women undergoing an IVF cycle were enrolled and a buccal smear was obtained. The presence of IL-1RN variable number of tandem repeats and IL-1B + 3953, MMP2-1306 and MMP9-1562 single nucleotide substitutions were determined. Patients were divided into pregnancy failures (119), biochemical pregnancies (8) and clinical pregnancies (33). RESULTS: There was a 40% decrease in IL-1RN*2 allele frequency (P = 0.024) and a 45% decrease in IL-1RN*2 carrier status in the clinical pregnancy group as compared to the pregnancy failure group (P = 0.017). This decrease was still statistically significant after a multivariate logistic regression analysis. The likelihood of a clinical pregnancy was decreased accordingly in IL-1RN*2 carriers: odds ratio = 0.349, 95% confidence interval = 0.2-0.8, P = 0.017. The IL-1B, MMP2 and MMP9 polymorphisms showed no correlation with IVF outcome. CONCLUSIONS: IL-1RN*2 allele carriage is associated with a poor prognosis of achieving a pregnancy after IVF.
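To make the reported odds ratio concrete, the sketch below converts it into an absolute clinical-pregnancy probability for IL-1RN*2 carriers, assuming a hypothetical baseline rate among non-carriers (not given in the abstract).

```python
# Sketch: how the reported odds ratio (0.349) for IL-1RN*2 carriage would
# scale the chance of a clinical pregnancy. The baseline (non-carrier)
# pregnancy rate used below is an assumption for illustration only.

def apply_odds_ratio(baseline_prob: float, odds_ratio: float) -> float:
    """Convert a baseline probability plus an odds ratio into a new probability."""
    odds = baseline_prob / (1.0 - baseline_prob)
    new_odds = odds * odds_ratio
    return new_odds / (1.0 + new_odds)

baseline = 0.25  # hypothetical clinical-pregnancy rate among non-carriers
print(f"carrier rate ≈ {apply_odds_ratio(baseline, 0.349):.1%}")  # ≈ 10.4%
```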
Abstract:
Using the REFLEX/HEGESCO survey, this article explores the probability of education-job mismatch in Central and Eastern Europe. We classify the countries into two groups according to the transparency of educational credentials in the labour market. Poland, the Czech Republic and Slovenia form the more transparent group, while Hungary, Lithuania and Estonia form the more opaque group. We analyse three types of mismatch: vertical (under- and over-education), horizontal (field-of-study mismatch) and skills mismatch. The analysis focuses on the effect of individuals' fields of study and competences on labour-market mismatch in these countries. The results show substantial differences between the two groups of countries studied.
Abstract:
This paper analyses and discusses arguments that emerge from a recent discussion about the proper assessment of the evidential value of correspondences observed between the characteristics of a crime stain and those of a sample from a suspect when (i) this latter individual is found as a result of a database search and (ii) remaining database members are excluded as potential sources (because of different analytical characteristics). Using a graphical probability approach (i.e., Bayesian networks), the paper here intends to clarify that there is no need to (i) introduce a correction factor equal to the size of the searched database (i.e., to reduce a likelihood ratio), nor to (ii) adopt a propositional level not directly related to the suspect matching the crime stain (i.e., a proposition of the kind 'some person in (outside) the database is the source of the crime stain' rather than 'the suspect (some other person) is the source of the crime stain'). The present research thus confirms existing literature on the topic that has repeatedly demonstrated that the latter two requirements (i) and (ii) should not be a cause of concern.
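The numerical point under discussion can be shown in a few lines: the likelihood ratio for suspect-level propositions versus the n-fold "database correction" that the paper argues against. The profile frequency and database size below are hypothetical.

```python
# Sketch of the point at issue: a likelihood ratio for "the suspect is the
# source" given a matching profile, with and without the naive database
# correction. All numbers are hypothetical.

profile_freq = 1e-6      # hypothetical random-match probability
database_size = 100_000  # hypothetical size of the searched database

lr = 1.0 / profile_freq               # LR for the matching suspect
naive_corrected = lr / database_size  # the n-fold reduction the paper rejects

print(f"LR (suspect-level propositions): {lr:,.0f}")
print(f"Naively corrected LR (not recommended): {naive_corrected:,.0f}")
```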
Abstract:
The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples are analysed in an accurate and reproducible way and that they are compared in an objective and automated way. This latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin layer chromatography, despite its reputation of lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinion, and which is usually not very informative. While there is room for improvement, this report demonstrates the significant gains obtained over the traditional subjective approach for the search of ink specimens in ink databases, and the interpretation of their evidential value.
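As a rough illustration of the library-search idea only (not the HPTLC comparison metrics developed in Parts I and II), the sketch below ranks hypothetical reference ink profiles by a simple correlation with a questioned profile.

```python
# Toy sketch of the library-search idea: rank reference ink profiles by a
# correlation-based similarity to a questioned profile. Real HPTLC data and
# the paper's comparison algorithms are not reproduced here.

import numpy as np

library = {  # hypothetical normalized intensity profiles per reference ink
    "ink_A": np.array([0.10, 0.60, 0.20, 0.10]),
    "ink_B": np.array([0.40, 0.10, 0.10, 0.40]),
    "ink_C": np.array([0.10, 0.50, 0.30, 0.10]),
}
questioned = np.array([0.12, 0.55, 0.25, 0.08])

scores = {name: float(np.corrcoef(profile, questioned)[0, 1])
          for name, profile in library.items()}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(name, round(score, 3))  # best-matching reference inks first
```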
Abstract:
False identity documents constitute a potentially powerful source of forensic intelligence because they are essential elements of transnational crime and provide cover for organized crime. In previous work, a systematic profiling method using false documents' visual features has been built within a forensic intelligence model. In the current study, the comparison process and metrics lying at the heart of this profiling method are described and evaluated. This evaluation takes advantage of 347 false identity documents of four different types seized in two countries whose sources were known to be common or different (following police investigations and dismantling of counterfeit factories). Intra-source and inter-source variations were evaluated through the computation of more than 7500 similarity scores. The profiling method could thus be validated and its performance assessed using two complementary approaches to measuring type I and type II error rates: a binary classification and the computation of likelihood ratios. Very low error rates were measured across the four document types, demonstrating the validity and robustness of the method to link documents to a common source or to differentiate them. These results pave the way for an operational implementation of a systematic profiling process integrated in a developed forensic intelligence model.
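A minimal sketch of the two evaluation approaches mentioned, run on synthetic similarity scores rather than the study's 7500 real ones: the two error rates of a binary same-/different-source decision at a threshold, and a kernel-density score-based likelihood ratio.

```python
# Illustrative sketch: error rates of a thresholded same-/different-source
# decision, plus a simple score-based likelihood ratio. The scores below
# are synthetic, not the similarity scores computed in the study.

import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
same_source = rng.normal(0.8, 0.08, 500)   # intra-source similarity scores
diff_source = rng.normal(0.4, 0.10, 500)   # inter-source similarity scores

threshold = 0.6
missed_links = np.mean(same_source < threshold)   # linked pairs wrongly separated
false_links = np.mean(diff_source >= threshold)   # unlinked pairs wrongly linked

kde_same, kde_diff = gaussian_kde(same_source), gaussian_kde(diff_source)

def likelihood_ratio(score: float) -> float:
    """Score-based LR: density under same-source vs different-source scores."""
    return kde_same(score)[0] / kde_diff(score)[0]

print(missed_links, false_links, likelihood_ratio(0.75))
```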
Abstract:
BACKGROUND: Poor long-term adherence is an important cause of uncontrolled hypertension. We examined whether monitoring drug adherence with an electronic system improves long-term blood pressure (BP) control in hypertensive patients followed by general practitioners (GPs). METHODS: A pragmatic cluster randomised controlled study was conducted over one year in community pharmacists/GPs' networks randomly assigned either to usual care (UC), where drugs were dispensed as usual, or to an intervention (INT) group where drug adherence could be monitored with an electronic system (Medication Event Monitoring System). No therapy change was allowed during the first 2 months in both groups. Thereafter, GPs could modify therapy and use electronic monitors freely in the INT group. The primary outcome was a target office BP <140/90 mmHg. RESULTS: Sixty-eight treated uncontrolled hypertensive patients (UC: 34; INT: 34) were enrolled. Over the 12-month period, the likelihood of reaching the target BP was higher in the INT group compared to the UC group (p<0.05). At 4 months, 38% in the INT group reached the target BP vs. 12% in the UC group (p<0.05), and 21% vs. 9% at 12 months (p: ns). Multivariate analyses, taking account of baseline characteristics, therapy modification during follow-up, and clustering effects by network, indicate that being allocated to the INT group was associated with greater odds of reaching the target BP at 4 months (p<0.01) and at 12 months (p=0.051). CONCLUSION: GPs monitoring drug adherence in collaboration with pharmacists achieved better BP control in hypertensive patients, although the impact of monitoring decreased with time.
Abstract:
This report, the Full Report, is the culmination of the Task Force's responsibilities as set out in Executive Order 5, dated October 30, 2007. The Executive Order specifies a number of goals and report requirements. There is a commonly held perception that the use of detention may serve as a deterrent to future delinquency. Data in this report reflect that approximately 40% of youth detained in 2006 were re-detained in 2006. Research conducted by national experts indicates that, particularly for low-risk/low-level offenders, the use of detention is not neutral and may increase the likelihood of recidivism. Comparable data for Iowa are not available (national data studied for this report provide level of risk, but risk level related to detention is not presently available for Iowa). The Task Force finds no evidence suggesting that recidivism levels (as related to detention risk) in Iowa should be different than found in other states. Data in this report also suggest that detention is one of the juvenile justice system's more costly sanctions ($257-$340 per day). Other sites and local jurisdictions have been able to redirect savings from the reduced use of juvenile detention to support less costly, community-based detention alternatives without compromising public safety.
Abstract:
Molecular phylogeny of soricid shrews (Soricidae, Eulipotyphla, Mammalia) based on 1140 bp mitochondrial cytochrome b gene (cytb) sequences was inferred by the maximum likelihood (ML) method. All 13 genera of extant Soricinae and two genera of Crocidurinae were included in the analyses. Anourosorex was phylogenetically distant from the main groupings within Soricinae and Crocidurinae in the ML tree. Thus, it could not be determined to which subfamily Anourosorex should be assigned: Soricinae, Crocidurinae or a new subfamily. Soricinae (excluding Anourosorex) should be divided into four tribes: Neomyini, Notiosoricini, Soricini and Blarinini. However, monophyly of Blarinini was not robust in the present data set. Also, branching orders among tribes of Soricinae and those among genera of Neomyini could not be determined because of insufficient phylogenetic information of the cytb sequences. For water shrews of Neomyini (Chimarrogale, Nectogale and Neomys), monophyly of Neomys and the Chimarrogale-Nectogale group could not be verified, which implies the possibility of multiple origins for the semi-aquatic mode of living among taxa within Neomyini. Episoriculus may contain several separate genera. Blarinella was included in Blarinini not Soricini, based on the cytb sequences, but the confidence level was rather low; hence more phylogenetic information is needed to determine its phylogenetic position. Furthermore, some specific problems of taxonomy of soricid shrews were clarified, for example phylogeny of local populations of Notiosorex crawfordi, Chimarrogale himalayica and Crocidura attenuata.
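Full tree-based ML inference is beyond a short snippet, but the likelihood principle behind it can be illustrated with the Jukes-Cantor maximum-likelihood distance between two aligned fragments; the sequences below are toy strings, not real cytb data.

```python
# Not the tree-search ML used in the study; just a minimal illustration of
# likelihood-based sequence comparison: the ML distance between two aligned
# sequences under the Jukes-Cantor (JC69) substitution model.

import math

def jc69_ml_distance(seq_a: str, seq_b: str) -> float:
    """ML estimate of substitutions per site under the Jukes-Cantor model."""
    pairs = [(a, b) for a, b in zip(seq_a, seq_b) if a in "ACGT" and b in "ACGT"]
    p_diff = sum(a != b for a, b in pairs) / len(pairs)
    if p_diff >= 0.75:
        return float("inf")  # saturation: the ML estimate is undefined
    return -0.75 * math.log(1.0 - 4.0 * p_diff / 3.0)

print(jc69_ml_distance("ATGACCAACATCCGAAAA", "ATGACCAATATTCGGAAA"))  # ≈ 0.19
```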
Abstract:
Predictive groundwater modeling requires accurate information about aquifer characteristics. Geophysical imaging is a powerful tool for delineating aquifer properties at an appropriate scale and resolution, but it suffers from problems of ambiguity. One way to overcome such limitations is to adopt a simultaneous multitechnique inversion strategy. We have developed a methodology for aquifer characterization based on structural joint inversion of multiple geophysical data sets followed by clustering to form zones and subsequent inversion for zonal parameters. Joint inversions based on cross-gradient structural constraints require less restrictive assumptions than, say, applying predefined petrophysical relationships and generally yield superior results. This approach has, for the first time, been applied to three geophysical data types in three dimensions. A classification scheme using maximum likelihood estimation is used to determine the parameters of a Gaussian mixture model that defines zonal geometries from joint-inversion tomograms. The resulting zones are used to estimate representative geophysical parameters of each zone, which are then used for field-scale petrophysical analysis. A synthetic study demonstrated how joint inversion of seismic and radar traveltimes and electrical resistance tomography (ERT) data greatly reduces misclassification of zones (down from 21.3% to 3.7%) and improves the accuracy of retrieved zonal parameters (from 1.8% to 0.3%) compared to individual inversions. We applied our scheme to a data set collected in northeastern Switzerland to delineate lithologic subunits within a gravel aquifer. The inversion models resolve three principal subhorizontal units along with some important 3D heterogeneity. Petrophysical analysis of the zonal parameters indicated approximately 30% variation in porosity within the gravel aquifer and an increasing fraction of finer sediments with depth.
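The classification step can be mimicked with an off-the-shelf Gaussian mixture model fitted by maximum likelihood (EM). The sketch below clusters synthetic per-cell parameter triples into two zones and is only a conceptual stand-in for the paper's scheme; all values are made up.

```python
# Conceptual sketch: clustering co-located geophysical parameters into zones
# with a Gaussian mixture model fitted by maximum likelihood (EM). The data
# are synthetic, not joint-inversion tomograms.

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# hypothetical per-cell parameters: [seismic velocity, radar velocity, log resistivity]
zone_a = rng.normal([1800.0, 0.09, 2.0], [60.0, 0.004, 0.1], size=(400, 3))
zone_b = rng.normal([2200.0, 0.07, 2.4], [60.0, 0.004, 0.1], size=(400, 3))
cells = np.vstack([zone_a, zone_b])

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
labels = gmm.fit_predict(cells)   # zone membership per model cell
zone_means = gmm.means_           # representative parameters per zone

print(np.bincount(labels), zone_means.round(2))
```

The zone means would then feed a petrophysical analysis, as the abstract describes for the field case.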
Abstract:
BACKGROUND: The Marburg Heart Score (MHS) aims to assist GPs in safely ruling out coronary heart disease (CHD) in patients presenting with chest pain, and to guide management decisions. AIM: To investigate the diagnostic accuracy of the MHS in an independent sample and to evaluate the generalisability to new patients. DESIGN AND SETTING: Cross-sectional diagnostic study with delayed-type reference standard in general practice in Hesse, Germany. METHOD: Fifty-six German GPs recruited 844 males and females aged ≥ 35 years, presenting between July 2009 and February 2010 with chest pain. Baseline data included the items of the MHS. Data on the subsequent course of chest pain, investigations, hospitalisations, and medication were collected over 6 months and were reviewed by an independent expert panel. CHD was the reference condition. Measures of diagnostic accuracy included the area under the receiver operating characteristic curve (AUC), sensitivity, specificity, likelihood ratios, and predictive values. RESULTS: The AUC was 0.84 (95% confidence interval [CI] = 0.80 to 0.88). For a cut-off value of 3, the MHS showed a sensitivity of 89.1% (95% CI = 81.1% to 94.0%), a specificity of 63.5% (95% CI = 60.0% to 66.9%), a positive predictive value of 23.3% (95% CI = 19.2% to 28.0%), and a negative predictive value of 97.9% (95% CI = 96.2% to 98.9%). CONCLUSION: Considering the diagnostic accuracy of the MHS, its generalisability, and ease of application, its use in clinical practice is recommended.
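For reference, the accuracy measures quoted above are all functions of a single 2×2 table. The sketch below uses counts back-calculated to be roughly consistent with the reported sensitivity, specificity and predictive values; the exact table is not given in the abstract.

```python
# Sketch: diagnostic-accuracy measures from a 2x2 table. The counts below
# are approximate reconstructions (they sum to 844 and roughly reproduce
# the reported measures), not the study's actual cross-tabulation.

def diagnostic_measures(tp: int, fp: int, fn: int, tn: int) -> dict:
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "LR+": sens / (1 - spec),
        "LR-": (1 - sens) / spec,
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
    }

print(diagnostic_measures(tp=83, fp=274, fn=10, tn=477))
```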
Abstract:
BACKGROUND: Legionella species cause severe forms of pneumonia with high mortality and complication rates. Accurate clinical predictors to assess the likelihood of Legionella community-acquired pneumonia (CAP) in patients presenting to the emergency department are lacking. METHODS: We retrospectively compared clinical and laboratory data of 82 consecutive patients with Legionella CAP with 368 consecutive patients with non-Legionella CAP included in two studies at the same institution. RESULTS: In multivariate logistic regression analysis we identified six parameters, namely high body temperature (OR 1.67, p < 0.0001), absence of sputum production (OR 3.67, p < 0.0001), low serum sodium concentrations (OR 0.89, p = 0.011), high levels of lactate dehydrogenase (OR 1.003, p = 0.007) and C-reactive protein (OR 1.006, p < 0.0001), and low platelet counts (OR 0.991, p < 0.0001), as independent predictors of Legionella CAP. Using optimal cut-off values of these six parameters, we calculated a diagnostic score for Legionella CAP. The median score was significantly higher in Legionella CAP than in non-Legionella CAP (4 (IQR 3-4) vs 2 (IQR 1-2), p < 0.0001), with a respective odds ratio of 3.34 (95% CI 2.57-4.33, p < 0.0001). Receiver operating characteristic analysis showed a high diagnostic accuracy of this score (AUC 0.86, 95% CI 0.81-0.90), which was better than that of each parameter alone. Of the 191 patients (42%) with a score of 0 or 1 point, only 3% had Legionella pneumonia. Conversely, of the 73 patients (16%) with ≥4 points, 66% had Legionella CAP. CONCLUSION: Six clinical and laboratory parameters embedded in a simple diagnostic score accurately identified patients with Legionella CAP. If validated in future studies, this score might aid in the management of suspected Legionella CAP.
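A sketch of the kind of additive point score described above, with one point per parameter beyond its cut-off; the cut-off values are placeholders, since the optimal cut-offs derived in the study are not given in the abstract.

```python
# Sketch of an additive six-parameter point score for Legionella CAP.
# The cut-offs below are placeholders, not the study's optimal cut-off values.

CUTOFFS = {
    "temperature_gt": 39.0,   # high body temperature (deg C)
    "sodium_lt": 133.0,       # low serum sodium (mmol/L)
    "ldh_gt": 500.0,          # high lactate dehydrogenase (U/L)
    "crp_gt": 180.0,          # high C-reactive protein (mg/L)
    "platelets_lt": 150.0,    # low platelet count (G/L)
}

def legionella_score(p: dict) -> int:
    """0-6 points: one per parameter suggestive of Legionella CAP."""
    return sum([
        p["temperature"] > CUTOFFS["temperature_gt"],
        not p["sputum_production"],            # absence of sputum production
        p["sodium"] < CUTOFFS["sodium_lt"],
        p["ldh"] > CUTOFFS["ldh_gt"],
        p["crp"] > CUTOFFS["crp_gt"],
        p["platelets"] < CUTOFFS["platelets_lt"],
    ])

patient = {"temperature": 39.6, "sputum_production": False, "sodium": 130,
           "ldh": 620, "crp": 250, "platelets": 120}
print(legionella_score(patient))  # 6 of 6 points for this hypothetical patient
```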