112 results for Reading and Interpretation of Statistical Graphs
Abstract:
The research considers the problem of spatial data classification using machine learning algorithms: probabilistic neural networks (PNN) and support vector machines (SVM). A simple k-nearest-neighbour algorithm is considered as a benchmark model. PNN is a neural-network reformulation of well-known nonparametric principles of probability density modelling, combining a kernel density estimator with Bayesian optimal or maximum a posteriori decision rules. PNN is well suited to problems where not only predictions but also quantification of accuracy and integration of prior information are necessary. An important property of PNN is that it can easily be used in decision-support systems dealing with problems of automatic classification. Support vector machines are an implementation of the principles of statistical learning theory for classification tasks. Recently they have been successfully applied to a variety of environmental topics: classification of soil types and hydro-geological units, optimization of monitoring networks, and susceptibility mapping of natural hazards. In the present paper both simulated and real data case studies (low- and high-dimensional) are considered. The main attention is paid to the detection and learning of spatial patterns by the algorithms applied.
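As a concrete illustration of the decision rule behind PNN, here is a minimal sketch (not the paper's implementation): a per-class Gaussian kernel density estimate combined with a maximum a posteriori decision, assuming a single bandwidth sigma and equal class priors.

```python
# Minimal PNN sketch: Parzen-window density per class + MAP decision.
# The bandwidth sigma and equal class priors are illustrative assumptions.
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=1.0):
    classes = np.unique(y_train)
    posteriors = []
    for c in classes:
        Xc = X_train[y_train == c]                       # training points of class c
        # squared distances from each test point to each class-c point
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=2)
        # kernel density estimate: average of Gaussian kernels
        density = np.exp(-d2 / (2 * sigma ** 2)).mean(axis=1)
        posteriors.append(density / len(classes))        # equal priors assumed
    return classes[np.argmax(np.stack(posteriors, axis=1), axis=1)]

X = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
y = np.array([0, 0, 1, 1])
print(pnn_predict(X, y, np.array([[0.5, 0.5], [5.5, 5.5]])))  # [0 1]
```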
Abstract:
This paper reports on the purpose, design, methodology and target audience of e-learning courses in forensic interpretation offered by the authors since 2010, including practical experience gained throughout the implementation period of this project. This initiative was motivated by the fact that reporting results of forensic examinations in a logically correct and scientifically rigorous way is a daily challenge for any forensic practitioner. Indeed, interpretation of raw data and communication of findings in both written and oral statements are topics where knowledge and applied skills are needed. Although most forensic scientists hold educational records in traditional sciences, only a few have actually followed full courses focused on interpretation issues. Such courses should include foundational principles and methodology - including elements of forensic statistics - for the evaluation of forensic data in a way that is tailored to meet the needs of the criminal justice system. To help bridge this gap, the authors' initiative seeks to offer educational opportunities that allow practitioners to acquire knowledge and competence in current approaches to the evaluation and interpretation of forensic findings. These cover, among other aspects, probabilistic reasoning (including Bayesian networks and other methods of forensic statistics, tools and software), case pre-assessment, skills in the oral and written communication of uncertainty, and the development of independence and self-confidence to solve practical inference problems. E-learning was chosen as the general format because it helps to form a trans-institutional online community of practitioners from varying forensic disciplines and levels of field experience - reporting officers, (chief) scientists, forensic coordinators, but also lawyers - all of whom can interact directly from their personal workplaces regardless of distance, travel expenses or time schedules. In the authors' experience, the proposed learning initiative supports participants in developing their expertise and skills in forensic interpretation, and also offers an opportunity for the associated institutions and the forensic community to reinforce the development of a harmonized view of interpretation across forensic disciplines, laboratories and judicial systems.
Abstract:
Background: We aimed to analyze the rate and time distribution of pre- and post-morbid cerebrovascular events in a single ischemic stroke population, and whether these depend on the etiology of the index stroke. Methods: In 2,203 consecutive patients admitted to a single stroke center registry (ASTRAL), the ischemic stroke that led to admission was considered the index event. Frequency distribution and cumulative relative distribution graphs of the most recent and first recurrent event (ischemic stroke, transient ischemic attack, intracranial or subarachnoid hemorrhage) were drawn in weekly and daily intervals, for all strokes and for individual stroke types. Results: Compared with identical time points before the index stroke, the frequency of events was most markedly reduced in the first week after the stroke (1.0 vs. 4.2%, p < 0.001) and the first month (2.7 vs. 7.4%, p < 0.001), and the reduction ebbed over the first year (8.4 vs. 13.1%, p < 0.001). On a daily basis, the peak frequency was observed at day -1 (1.6%), with a reduction to 0.7% on the index day and 0.17% 24 h after. The event rate in patients with atherosclerotic stroke was particularly high around the index event, but the 1-year cumulative recurrence rate was similar across all stroke types. Conclusions: We confirm a short window of increased vulnerability in ischemic stroke and show a 4-, 3- and 2-fold reduction in post-stroke events at 1 week, 1 month and 1 year, respectively, compared with identical pre-stroke periods. This break in the 'stroke wave' is particularly striking after atherosclerotic and lacunar strokes.
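For readers unfamiliar with such graphs, the following minimal sketch shows the binning they rest on: event offsets relative to the index stroke (negative = before) grouped into weekly intervals. The offsets are invented for illustration; nothing here reproduces the ASTRAL analysis.

```python
# Sketch of the binning behind frequency- and cumulative-distribution
# graphs of events around an index stroke. Offsets in days are invented.
import numpy as np

offsets_days = np.array([-380, -45, -6, -1, -1, 0, 3, 20, 150, 300])
week_edges = np.arange(-371, 372, 7)                 # weekly bins over +/- 1 year
counts, _ = np.histogram(offsets_days, bins=week_edges)
relative = counts / len(offsets_days)                # frequency distribution
cumulative = np.cumsum(relative)                     # cumulative relative distribution
```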
Abstract:
BACKGROUND & AIMS: Standardized instruments are needed to assess the activity of eosinophilic esophagitis (EoE) and to provide end points for clinical trials and observational studies. We aimed to develop and validate a patient-reported outcome (PRO) instrument and score based on items that could account for variations in patient assessments of disease severity. We also evaluated relationships between patient assessment of disease severity and EoE-associated endoscopic, histologic, and laboratory findings. METHODS: We collected information from 186 patients with EoE in Switzerland and the United States (69.4% male; median age, 43 y) via surveys (n = 135), focus groups (n = 27), and semistructured interviews (n = 24). Items for the instrument to assess biologic activity were generated based on physician input. Linear regression was used to quantify the extent to which variations in patient-reported disease characteristics could account for variations in patient assessment of EoE severity. The PRO instrument was used prospectively in 153 adult patients with EoE (72.5% male; median age, 38 y) and validated in an independent group of 120 patients with EoE (60.8% male; median age, 40.5 y). RESULTS: Seven PRO factors assessing characteristics of dysphagia, behavioral adaptations to living with dysphagia, and pain while swallowing accounted for 67% of the variation in patient assessment of disease severity. Based on statistical considerations and patient input, a 7-day recall period was selected. Highly active EoE, based on endoscopic and histologic findings, was associated with an increase in patient-assessed disease severity. In the validation study, the mean difference between patient assessment of EoE severity (range, 0-10) and PRO score (range, 0-8.52) was 0.15. CONCLUSIONS: We developed and validated an EoE scoring system based on 7 PRO items that assess symptoms over a 7-day recall period. Clinicaltrials.gov number: NCT00939263.
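The "accounted for 67% of the variation" statement corresponds to the R^2 of a linear regression of the global severity rating on the PRO items. A minimal sketch under invented data (the item scores, coefficients and sample size below are illustrative only):

```python
# Share of variance in a global severity rating accounted for by PRO
# items, via the R^2 of an ordinary least-squares fit. Data are simulated.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(186, 7))            # 7 hypothetical PRO item scores
severity = X @ rng.normal(size=7) + rng.normal(scale=0.8, size=186)

X1 = np.column_stack([np.ones(len(X)), X])           # add intercept
beta, *_ = np.linalg.lstsq(X1, severity, rcond=None)
resid = severity - X1 @ beta
r2 = 1 - resid.var() / severity.var()    # fraction of variance accounted for
print(f"R^2 = {r2:.2f}")
```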
Abstract:
OBJECTIVE: Candidate genes for non-alcoholic fatty liver disease (NAFLD) identified by a bioinformatics approach were examined for variant associations to quantitative traits of NAFLD-related phenotypes. RESEARCH DESIGN AND METHODS: By integrating public database text mining, trans-organism protein-protein interaction transferal, and information on liver protein expression, a protein-protein interaction network was constructed, and from this a smaller isolated interactome was identified. Five genes from this interactome were selected for genetic analysis. Twenty-one tag single-nucleotide polymorphisms (SNPs), which captured all common variation in these genes, were genotyped in 10,196 Danes and analyzed for association with NAFLD-related quantitative traits, type 2 diabetes (T2D), central obesity, and WHO-defined metabolic syndrome (MetS). RESULTS: 273 genes were included in the protein-protein interaction analysis, and EHHADH, ECHS1, HADHA, HADHB, and ACADL were selected for further examination. A total of 10 nominally statistically significant associations (P<0.05) with quantitative metabolic traits were identified. The case-control study also showed associations between variation in the five genes and T2D, central obesity, and MetS, respectively. Bonferroni adjustment for multiple testing negated all associations. CONCLUSIONS: Using a bioinformatics approach we identified five candidate genes for NAFLD. However, we failed to provide evidence of associations with major effects between SNPs in these five genes and NAFLD-related quantitative traits, T2D, central obesity, and MetS.
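A minimal sketch of why the Bonferroni adjustment negated the nominal findings, assuming one test per tag SNP; the p-values below are invented:

```python
# Bonferroni adjustment: each of m tests is held to alpha/m instead of alpha.
m = 21                                   # e.g. one test per tag SNP
alpha = 0.05
p_values = [0.004, 0.012, 0.030, 0.049]  # nominally significant (< 0.05), invented
threshold = alpha / m                    # ~0.0024 after Bonferroni
survives = [p for p in p_values if p < threshold]
print(survives)                          # [] -- none survive the correction
```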
Abstract:
BACKGROUND: Chest pain can be caused by various conditions, with life-threatening cardiac disease being of greatest concern. Prediction scores to rule out coronary artery disease have been developed for use in emergency settings. We developed and validated a simple prediction rule for use in primary care. METHODS: We conducted a cross-sectional diagnostic study in 74 primary care practices in Germany. Primary care physicians recruited all consecutive patients who presented with chest pain (n = 1249) and recorded symptoms and findings for each patient (derivation cohort). An independent expert panel reviewed follow-up data obtained at six weeks and six months on symptoms, investigations, hospital admissions and medications to determine the presence or absence of coronary artery disease. Adjusted odds ratios of relevant variables were used to develop a prediction rule. We calculated measures of diagnostic accuracy for different cut-off values for the prediction scores using data derived from another prospective primary care study (validation cohort). RESULTS: The prediction rule contained five determinants (age/sex, known vascular disease, patient assumes pain is of cardiac origin, pain is worse during exercise, and pain is not reproducible by palpation), with the score ranging from 0 to 5 points. The area under the curve (receiver operating characteristic curve) was 0.87 (95% confidence interval [CI] 0.83-0.91) for the derivation cohort and 0.90 (95% CI 0.87-0.93) for the validation cohort. The best overall discrimination was with a cut-off value of 3 (positive result 3-5 points; negative result ≤2 points), which had a sensitivity of 87.1% (95% CI 79.9%-94.2%) and a specificity of 80.8% (95% CI 77.6%-83.9%). INTERPRETATION: The prediction rule for coronary artery disease in primary care proved to be robust in the validation cohort. It can help to rule out coronary artery disease in patients presenting with chest pain in primary care.
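As a sketch of how a five-determinant rule of this kind is applied, here is a hypothetical scoring function; the argument names paraphrase the determinants and are not taken from the paper:

```python
# Hypothetical scoring function: one point per positive determinant,
# score 0-5, with the reported cut-off of >= 3 points for a positive result.
def chest_pain_score(age_sex_positive, known_vascular_disease,
                     patient_assumes_cardiac, pain_worse_on_exercise,
                     pain_not_reproducible_by_palpation):
    points = sum([age_sex_positive, known_vascular_disease,
                  patient_assumes_cardiac, pain_worse_on_exercise,
                  pain_not_reproducible_by_palpation])
    return points, ("positive" if points >= 3 else "negative")

print(chest_pain_score(True, False, True, True, False))  # (3, 'positive')
```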
Abstract:
Due to various contexts and processes, forensic science communities may have different approaches, largely influenced by their criminal justice systems. However, forensic science practices share some common characteristics. One is the assurance of high (scientific) quality within processes and practices. For most crime laboratory directors and forensic science associations, this issue is conditioned by the triangle of quality, which represents the current paradigm of quality assurance in the field. It consists of the implementation of standardization, certification, accreditation, and an evaluation process. It constitutes a clear and sound way to exchange data between laboratories and enables databasing, because standardized methods ensure reliable and valid results; it is also a means of defining minimum requirements for practitioners' skills for specific forensic science activities. The control of each of these aspects offers non-forensic-science partners the assurance that the entire process has been mastered and is trustworthy. Most of the standards focus on the analysis stage and do not consider the pre- and post-laboratory stages, namely the work carried out at the investigation scene and the evaluation and interpretation of the results, intended for intelligence beneficiaries or for court. Such localized consideration prevents forensic practitioners from identifying where the problems really lie with regard to criminal justice systems. According to a performance-management approach, scientific quality should not be restricted to standardized procedures and controls in forensic science practice. Ensuring high quality also strongly depends on the way a forensic science culture is assimilated (into specific education, training and workplaces) and the way practitioners understand forensic science as a whole.
Abstract:
We have used surface-based electrical resistivity tomography to detect and characterize preferential hydraulic pathways in the immediate downstream area of an abandoned, hazardous landfill. The landfill occupies the void left by a former gravel pit; its base is close to the groundwater table and lacks an engineered barrier. As such, this site is remarkably typical of many small- to medium-sized waste deposits throughout the densely populated and heavily industrialized foreland on both sides of the Alpine arc. Outflows of pollutants persistently contaminated local drinking water supplies and necessitated a partial remediation in the form of a synthetic cover barrier, which is meant to prevent meteoric water from percolating through the waste before reaching the groundwater table. Any future additional isolation of the landfill in the form of lateral barriers thus requires adequate knowledge of potential preferential hydraulic pathways for outflowing contaminants. Our results, inferred from a suite of tomographically inverted surface-based electrical resistivity profiles oriented roughly perpendicular to the local hydraulic gradient, indicate that potential contaminant outflows would predominantly occur along an unexploited lateral extension of the original gravel deposit. This extension is expressed as a distinct and laterally continuous high-resistivity anomaly in the resistivity tomograms, an interpretation that is ground-truthed by a litholog from a nearby well. Since the probed glacio-fluvial deposits are largely devoid of mineralogical clay, the geometry of hydraulic and electrical pathways across the pore space of a given lithological unit can be assumed to be identical, which allows for an order-of-magnitude estimation of the overall permeability structure. These estimates indicate that the permeability of the imaged extension of the gravel body is at least two to three orders of magnitude higher than that of its finer-grained embedding matrix, corroborating the preeminent role of the high-resistivity anomaly as a potential preferential flow path.
Abstract:
This study reconstructs the phylogeography of Aegilops geniculata, an allotetraploid relative of wheat, to discuss the impact of past climate changes and recent human activities (e.g. the early expansion of agriculture) on the genetic diversity of ruderal plant species. We combined chloroplast DNA (cpDNA) sequencing, analysed using a statistical parsimony network, with nonhierarchical K-means clustering of amplified fragment length polymorphism (AFLP) genotypes to unravel patterns of genetic structure across the native range of Ae. geniculata. The AFLP dataset was further explored by measuring regional genetic diversity and detecting isolation-by-distance patterns. Both cpDNA and AFLP suggest an eastern Mediterranean origin of Ae. geniculata. Two lineages have spread independently over northern and southern Mediterranean areas. Northern populations show low genetic diversity but strong phylogeographical structure among the main peninsulas, indicating a major influence of glacial cycles. By contrast, low genetic structuring and high genetic diversity are detected in southern Mediterranean populations. Finally, we highlight human-mediated dispersal resulting in substantial introgression between resident and migrant populations. We have shown that the evolutionary trajectories of ruderal plants can be similar to those of wild species, but are modified by human activities, which promote range expansions through increased long-distance dispersal and the creation of suitable habitats.
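The nonhierarchical K-means step can be pictured with a minimal sketch on AFLP genotypes coded as binary fragment presence/absence; the data matrix, sample size and number of clusters below are invented for illustration:

```python
# K-means clustering of binary AFLP genotypes (individuals x loci) as one
# way to delimit genetic groups. Data are simulated, not from the study.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
aflp = rng.integers(0, 2, size=(40, 120)).astype(float)  # 40 plants x 120 loci
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(aflp)
print(labels)                                            # cluster assignment per plant
```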
Abstract:
We assessed whether fasting modifies the prognostic value of triglyceride measurements for the risk of myocardial infarction (MI). Analyses used mixed-effect models and Poisson regression. After confounders were controlled for, fasting triglyceride levels were, on average, 0.122 mmol/L lower than nonfasting levels. Each 2-fold increase in the latest triglyceride level was associated with a 38% increase in MI risk (relative rate, 1.38; 95% confidence interval, 1.26-1.51); fasting status did not modify this association. Our results suggest that it may not be necessary to restrict analyses to fasting measurements when considering MI risk.
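A worked example of the reported "38% per 2-fold increase": if the rate ratio per doubling of the latest triglyceride level is 1.38, an n-fold increase compounds as 1.38 raised to log2(n). Only the 1.38 figure comes from the abstract; the fold changes are illustrative:

```python
# Compounding a per-doubling rate ratio across larger fold changes.
import math

rr_per_doubling = 1.38                   # from the abstract
for fold in (2, 4, 8):
    rr = rr_per_doubling ** math.log2(fold)
    print(f"{fold}-fold increase -> rate ratio {rr:.2f}")
# 2-fold -> 1.38, 4-fold -> 1.90, 8-fold -> 2.63
```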
Abstract:
This paper evaluates the reception of Léon Walras' ideas in Russia before 1920. Despite an unfavourable institutional context, Walras was read by Russian economists. On the one hand, Bortkiewicz and Winiarski, who lived outside Russia and had the opportunity to meet and correspond with Walras, were first-class readers and very good ambassadors for Walras' ideas; on the other hand, the economists living in Russia were more selective in their reading. They restricted themselves to Walras' Elements of Pure Economics, in particular its theory of exchange, while ignoring its theory of production. We introduce a cultural argument to explain this selective reading. JEL classification: B13, B19.
Abstract:
BACKGROUND: Present combination antiretroviral therapy (cART) alone does not cure HIV infection and requires lifelong drug treatment. The potential role of HIV therapeutic vaccines as part of an HIV cure is under consideration. Our aim was to assess the efficacy, safety, and immunogenicity of Vacc-4x, a peptide-based HIV-1 therapeutic vaccine targeting conserved domains on p24(Gag), in adults infected with HIV-1. METHODS: Between July, 2008, and June, 2010, we did a multinational double-blind, randomised, phase 2 study comparing Vacc-4x with placebo. Participants were adults infected with HIV-1 who were aged 18-55 years and virologically suppressed on cART (viral load <50 copies per mL) with CD4 cell counts of 400 × 10(6) cells per L or greater. The trial was done at 18 sites in Germany, Italy, Spain, the UK, and the USA. Participants were randomly assigned (2:1) to Vacc-4x or placebo. Group allocation was masked from participants and investigators. Four primary immunisations, weekly for 4 weeks, containing Vacc-4x (or placebo) were given intradermally after administration of adjuvant. Booster immunisations were given at weeks 16 and 18. At week 28, cART was interrupted for up to 24 weeks. The coprimary endpoints were cART resumption and changes in CD4 counts during treatment interruption. Analyses were by modified intention to treat: all participants who received one intervention. Furthermore, safety, viral load, and immunogenicity (as measured by ELISPOT and proliferation assays) were assessed. The 52 week follow-up period was completed in June, 2011. For the coprimary endpoints the proportion of participants who met the criteria for cART resumption was analysed with a logistic regression model with the treatment effect being assessed in a model including country as a covariate. This study is registered with ClinicalTrials.gov, number NCT00659789. FINDINGS: 174 individuals were screened; because of slow recruitment, enrolment stopped with 136 of a planned 345 participants and 93 were randomly assigned to receive Vacc-4x and 43 to receive placebo. There were no differences between the two groups for the primary efficacy endpoints in those participants who stopped cART at week 28. Of the participants who resumed cART, 30 (34%) were in the Vacc-4x group and 11 (29%) in the placebo group, and percentage changes in CD4 counts were not significant (mean treatment difference -5·71, 95% CI -13·01 to 1·59). However, a significant difference in viral load was noted for the Vacc-4x group both at week 48 (median 23 100 copies per mL Vacc-4x vs 71 800 copies per mL placebo; p=0·025) and week 52 (median 19 550 copies per mL vs 51 000 copies per mL; p=0·041). One serious adverse event, exacerbation of multiple sclerosis, was reported as possibly related to study treatment. Vacc-4x was immunogenic, inducing proliferative responses in both CD4 and CD8 T-cell populations. INTERPRETATION: The proportion of participants resuming cART before end of study and change in CD4 counts during the treatment interruption showed no benefit of vaccination. Vacc-4x was safe, well tolerated, immunogenic, seemed to contribute to a viral-load setpoint reduction after cART interruption, and might be worth consideration in future HIV-cure investigative strategies. FUNDING: Norwegian Research Council GLOBVAC Program and Bionor Pharma ASA.
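The coprimary-endpoint analysis described above (logistic regression of cART resumption on treatment group, with country as a covariate) can be sketched as follows; the data frame is simulated and in no way reproduces the trial data:

```python
# Logistic regression of a binary endpoint on treatment with a country
# covariate, mirroring the analysis named in the abstract. Data simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 120
df = pd.DataFrame({
    "vacc4x": rng.integers(0, 2, n),                       # 1 = Vacc-4x, 0 = placebo
    "country": rng.choice(["DE", "IT", "ES", "UK", "US"], n),
})
df["resumed_cart"] = rng.binomial(1, 0.30 + 0.05 * df["vacc4x"])

model = smf.logit("resumed_cart ~ vacc4x + C(country)", data=df).fit(disp=0)
print(model.params)                                        # treatment log-odds ratio
```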
Abstract:
BACKGROUND: Aromatase inhibitors provide superior disease control when compared with tamoxifen as adjuvant therapy for postmenopausal women with endocrine-responsive early breast cancer. PURPOSE: To present the design, history, and analytic challenges of the Breast International Group (BIG) 1-98 trial: an international, multicenter, randomized, double-blind, phase-III study comparing the aromatase inhibitor letrozole with tamoxifen in this clinical setting. METHODS: From 1998-2003, BIG 1-98 enrolled 8028 women to receive monotherapy with either tamoxifen or letrozole for 5 years, or sequential therapy of 2 years of one agent followed by 3 years of the other. Randomization to one of four treatment groups permitted two complementary analyses to be conducted several years apart. The first, reported in 2005, provided a head-to-head comparison of letrozole versus tamoxifen. Statistical power was increased by an enriched design, which included patients who were assigned sequential treatments up to the time of the treatment switch. The second, reported in late 2008, used a conditional landmark approach to test the hypothesis that, for patients who are disease-free, switching endocrine agents at approximately 2 years from randomization is superior to continuing with the original agent. RESULTS: The 2005 analysis showed the superiority of letrozole compared with tamoxifen. The patients who were assigned tamoxifen alone were unblinded and offered the opportunity to switch to letrozole. Results from other trials increased the clinical relevance of the question of whether to start treatment with letrozole or tamoxifen, and analysis plans were expanded to evaluate sequential versus single-agent strategies from randomization. LIMITATIONS: Because of the unblinding of patients assigned tamoxifen alone, analysis of updated data will require ascertainment of the influence of selective crossover from tamoxifen to letrozole. CONCLUSIONS: BIG 1-98 is an example of an enriched design involving complementary analyses that address different questions several years apart, subject to evolving analytic plans influenced by new data that emerge over time.
Abstract:
This paper presents a statistical model for the quantification of the weight of fingerprint evidence. In contrast to previous (generative and score-based) models, our model estimates the probability distributions of spatial relationships, directions and types of minutiae observed on fingerprints for any given fingermark. Our model relies on an AFIS algorithm provided by 3M Cogent and on a dataset of more than 4,000,000 fingerprints to represent a sample from a relevant population of potential sources. The performance of our model was tested using several hundred minutiae configurations observed on a set of 565 fingermarks. In particular, we investigated the effects of various sub-populations of fingers (i.e., finger number, finger general pattern) on the expected evidential value of our test configurations. The performance of our model indicates that the spatial relationship between minutiae carries more evidential weight than their type or direction. Our results also indicate that the AFIS component of our model directly enables us to assign weight to fingerprint evidence without the need for the additional layer of complex statistical modeling involved in estimating the probability distributions of fingerprint features. In fact, the AFIS component appears to be more sensitive to the sub-population effects than the other components of the model. Overall, the data generated during this research project support the idea that fingerprint evidence is a valuable forensic tool for the identification of individuals.
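One way to picture how a comparison score can be turned into a weight of evidence is the generic score-based likelihood ratio: the observed score evaluated against score distributions under same-source and different-source propositions. This is a generic sketch for contrast with the approach above, not 3M Cogent's algorithm or the paper's model; the distributions and the score are invented:

```python
# Generic score-based likelihood ratio: density of the observed score
# under same-source vs. different-source score distributions (both
# simulated here purely for illustration).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)
same_source_scores = rng.normal(60, 8, 1000)    # hypothetical calibration data
diff_source_scores = rng.normal(20, 6, 10000)
score = 40.0                                    # observed comparison score

lr = (gaussian_kde(same_source_scores)(score)
      / gaussian_kde(diff_source_scores)(score)).item()
print(f"LR = {lr:.3g}")                         # > 1 supports same-source
```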
Abstract:
OBJECTIVE: To determine the risk of Down syndrome (DS) in multiple relative to singleton pregnancies, and to compare prenatal diagnosis rates and pregnancy outcome. DESIGN: Population-based prevalence study based on EUROCAT congenital anomaly registries. SETTING: Eight European countries. POPULATION: 14.8 million births in 1990-2009, of which 2.89% were multiple births. METHODS: DS cases included livebirths, fetal deaths from 20 weeks, and terminations of pregnancy for fetal anomaly (TOPFA). Zygosity was inferred from like/unlike sex for birth denominators, and from concordance for DS cases. MAIN OUTCOME MEASURES: Relative risk (RR) of DS per fetus/baby in multiple versus singleton pregnancies, and per pregnancy in monozygotic/dizygotic versus singleton pregnancies; proportion prenatally diagnosed; pregnancy outcome. STATISTICAL ANALYSIS: Poisson and logistic regression stratified for maternal age, country and time. RESULTS: Overall, the adjusted (adj) RR of DS for fetuses/babies from multiple versus singleton pregnancies was 0.58 (95% CI 0.53-0.62), similar for all maternal ages except for mothers over 44, for whom it was considerably lower. In 8.7% of twin pairs affected by DS, both co-twins were diagnosed with the condition. The adjRR of DS for monozygotic versus singleton pregnancies was 0.34 (95% CI 0.25-0.44), and for dizygotic versus singleton pregnancies 1.34 (95% CI 1.23-1.46). DS fetuses from multiple births were less likely to be prenatally diagnosed than singletons (adjOR 0.62 [95% CI 0.50-0.78]) and, following diagnosis, less likely to result in TOPFA (adjOR 0.40 [95% CI 0.27-0.59]). CONCLUSIONS: The risk of DS per fetus/baby is lower in multiple than in singleton pregnancies. These estimates can be used for genetic counselling and prenatal screening.
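As a minimal worked example of the headline figure: a crude (unadjusted) RR per fetus/baby is simply the ratio of DS rates between the two groups, before the maternal-age, country and time stratification the paper applies. The counts below are invented to land near the paper's adjusted RR of 0.58:

```python
# Crude relative risk of DS per fetus/baby, multiple vs. singleton
# pregnancies. Counts are invented for illustration only; the paper's
# 0.58 is an adjusted estimate from stratified Poisson regression.
ds_multiple, fetuses_multiple = 70, 100_000
ds_singleton, fetuses_singleton = 120, 100_000
rr = (ds_multiple / fetuses_multiple) / (ds_singleton / fetuses_singleton)
print(round(rr, 2))  # 0.58 with these illustrative counts
```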