919 results for GRAPHICAL LASSO
Abstract:
As a rigorous combination of probability and graph theory, Bayesian networks currently enjoy widespread interest as a means for studying factors that affect the coherent evaluation of scientific evidence in forensic science. Part I of this series of papers intends to contribute to the discussion of Bayesian networks as a framework that is helpful for both illustrating and implementing statistical procedures that are commonly employed for the study of uncertainties (e.g. the estimation of unknown quantities). While the respective statistical procedures are widely described in the literature, the primary aim of this paper is to offer an essentially non-technical introduction to how interested readers may use these analytical approaches - with the help of Bayesian networks - for processing their own forensic science data. Attention is mainly drawn to the structure and underlying rationale of a series of basic and context-independent network fragments that users may incorporate as building blocks while constructing larger inference models. As an example of how this may be done, the proposed concepts will be used in a second paper (Part II) for specifying graphical probability networks whose purpose is to assist forensic scientists in the evaluation of scientific evidence encountered in the context of forensic document examination (i.e. results of the analysis of black toners present on printed or copied documents).
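As a minimal illustration of the kind of elementary network fragment the paper refers to (the variables and all probabilities below are invented for illustration, not taken from the paper), a two-node hypothesis-evidence fragment can be evaluated by a direct application of Bayes' theorem:

    # Two-node Bayesian network fragment: H (hypothesis) -> E (evidence).
    # All probabilities are illustrative placeholders, not values from the paper.
    prior_h = {"true": 0.5, "false": 0.5}   # P(H)
    likelihood_e = {"true": 0.95,           # P(E observed | H true)
                    "false": 0.10}          # P(E observed | H false)

    # Joint probability P(H, E) for the observed evidence.
    joint = {h: prior_h[h] * likelihood_e[h] for h in prior_h}

    # Normalize by P(E) to obtain the posterior P(H | E).
    p_e = sum(joint.values())
    posterior = {h: joint[h] / p_e for h in joint}
    print(posterior)  # {'true': 0.904..., 'false': 0.095...}

Larger inference models are assembled by chaining such fragments, with each node's conditional probability table supplied by the analyst or estimated from data.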
Abstract:
Research has shown that one of the major contributing factors in early joint deterioration of portland cement concrete (PCC) pavement is the quality of the coarse aggregate. Conventional physical and freeze/thaw tests are slow and unsatisfactory for evaluating aggregate quality. Over the last ten years, the Iowa DOT has been evaluating X-ray analysis and other new technologies to predict aggregate durability in PCC pavement. The objective of this research is to evaluate thermogravimetric analysis (TGA) of carbonate aggregate. The TGA testing has been conducted with a TA 2950 Thermogravimetric Analyzer controlled by an IBM-compatible computer. A "TA Hi-RES" (trademark) software package allows for rapid testing while retaining high resolution. Carbon dioxide is driven off the dolomite fraction between 705 °C and 745 °C and off the calcite fraction between 905 °C and 940 °C. Graphical plots of temperature and weight loss using the same sample size and test procedure demonstrate that the test is very accurate and repeatable. A substantial number of both dolomites and limestones (calcites) have been subjected to TGA testing. The slopes of the weight-loss plot prior to the dolomite and calcite transitions do correlate with field performance. The noncarbonate fraction, which correlates with the acid insolubles, can be determined by TGA for most calcites and some dolomites. TGA has provided information that can be used to help predict the quality of carbonate aggregate.
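A hedged sketch of how such window-based weight-loss readings might be reduced to carbonate fractions (the temperature windows come from the abstract; the one-step stoichiometry, which assigns all dolomite CO2 loss to the lower window, and all names are simplifying assumptions for illustration):

    # Estimate carbonate fractions (in % of sample mass) from a TGA curve.
    CO2_PER_DOLOMITE = 88.02 / 184.40   # 2 CO2 per CaMg(CO3)2 formula unit
    CO2_PER_CALCITE = 44.01 / 100.09    # 1 CO2 per CaCO3 formula unit

    def carbonate_fractions(mass_pct_at):
        """mass_pct_at maps temperature (deg C) to remaining sample mass (%)."""
        loss_dolomite = mass_pct_at(705) - mass_pct_at(745)  # dolomite window
        loss_calcite = mass_pct_at(905) - mass_pct_at(940)   # calcite window
        dolomite = loss_dolomite / CO2_PER_DOLOMITE
        calcite = loss_calcite / CO2_PER_CALCITE
        noncarbonate = 100.0 - dolomite - calcite  # proxy for acid insolubles
        return dolomite, calcite, noncarbonate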
Abstract:
Interdependence is the main feature of dyadic relationships and, in recent years, various statistical procedures have been proposed for quantifying and testing this social attribute in different dyadic designs. The purpose of this paper is to develop several functions for this kind of statistical test in an R package, known as nonindependence, for use by applied social researchers. A Graphical User Interface (GUI) has also been developed to facilitate the use of the functions included in this package. Examples drawn from psychological research and simulated data are used to illustrate how the software works.
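The package itself is written in R; as a language-neutral sketch of one common interdependence index for exchangeable dyads, the double-entry (pairwise) correlation, consider the following (the data values are invented):

    import numpy as np

    def double_entry_correlation(x, y):
        """Pairwise (double-entry) correlation for exchangeable dyads.
        x[i] and y[i] are the two members' scores for dyad i; each dyad is
        entered twice, once in each order, so the index does not depend on
        an arbitrary labelling of the members."""
        a = np.concatenate([x, y])
        b = np.concatenate([y, x])
        return np.corrcoef(a, b)[0, 1]

    x = np.array([3.0, 4.5, 2.0, 5.0, 3.5])  # member 1 scores, 5 toy dyads
    y = np.array([2.5, 4.0, 2.5, 4.5, 3.0])  # member 2 scores
    print(double_entry_correlation(x, y))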
Abstract:
BACKGROUND: Practicing physicians face many medical decisions daily. These are mainly influenced by personal experience but should also consider patient preferences and the scientific evidence reflected in a constantly increasing number of medical publications and guidelines. With the objective of optimal medical treatment, the concept of evidence-based medicine is founded on these three aspects. It should be kept in mind that, without knowledge of the methodological background, there is a high risk of misinterpreting evidence, leading to medical errors and adverse effects. OBJECTIVES: This article explains the concept of systematic error (bias) and its importance. Causes and effects as well as methods to minimize bias are discussed. This information should impart a deeper understanding, leading to a better assessment of studies and implementation of their recommendations in daily medical practice. CONCLUSION: The risk of bias (RoB) tool, developed by the Cochrane Collaboration, is an instrument for assessing the potential for bias in controlled trials. Ease of handling, short processing time, high transparency of judgements and an easily comprehensible graphical presentation of findings are among its strengths. The German translation of the RoB tool is published as an appendix to this article. This should make the tool easier for non-experts to apply and, moreover, support evidence-based medical decision-making.
Abstract:
The broad aim of biomedical science in the postgenomic era is to link genomic and phenotype information to allow a deeper understanding of the processes leading from genomic changes to altered phenotype and disease. The EuroPhenome project (http://www.EuroPhenome.org) is a comprehensive resource for raw and annotated high-throughput phenotyping data arising from projects such as EUMODIC. EUMODIC is gathering data from the EMPReSSslim pipeline (http://www.empress.har.mrc.ac.uk/), which is performed on inbred mouse strains and knockout lines arising from the EUCOMM project. The EuroPhenome interface allows users to access the data via phenotype or genotype, and in a variety of ways, including graphical display, statistical analysis and access to the raw data via web services. The raw phenotyping data captured in EuroPhenome are annotated by a pipeline that automatically identifies statistically different mutants relative to the appropriate baseline and assigns ontology terms for the specific test. Mutant phenotypes can be quickly identified using two EuroPhenome tools: PhenoMap, a graphical representation of statistically relevant phenotypes, and mining for a mutant using ontology terms. To assist with data definition and cross-database comparisons, phenotype data are annotated using combinations of terms from biological ontologies.
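The statistical core of such an annotation step might look like the following sketch (the choice of test, the threshold, and the example ontology term are illustrative assumptions; the abstract does not specify EuroPhenome's pipeline at this level of detail):

    from scipy import stats

    def annotate(mutant_values, baseline_values, term, alpha=0.05):
        """Return the ontology term if the mutant line differs significantly
        from baseline; a plain two-sample t-test with a fixed alpha is an
        illustrative simplification of a real pipeline."""
        t_stat, p_value = stats.ttest_ind(mutant_values, baseline_values)
        return term if p_value < alpha else None

    print(annotate([5.1, 5.4, 4.8, 5.2], [7.0, 7.3, 6.9, 7.1],
                   term="MP:0001262 (decreased body weight)"))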
Abstract:
The success of combination antiretroviral therapy is limited by the evolutionary escape dynamics of HIV-1. We used Isotonic Conjunctive Bayesian Networks (I-CBNs), a class of probabilistic graphical models, to describe this process. We employed partial order constraints among viral resistance mutations, which give rise to a limited set of mutational pathways, and we modeled phenotypic drug resistance as monotonically increasing along any escape pathway. Using this model, the individualized genetic barrier (IGB) to each drug is derived as the probability of the virus not acquiring additional mutations that confer resistance. Drug-specific IGBs were combined to obtain the IGB to an entire regimen, which quantifies the virus's genetic potential for developing drug resistance under combination therapy. The IGB was tested as a predictor of therapeutic outcome using between 2,185 and 2,631 treatment change episodes of subtype B infected patients from the Swiss HIV Cohort Study Database, a large observational cohort. Using logistic regression, significant univariate predictors included most of the 18 drugs and single-drug IGBs, the IGB to the entire regimen, the expert rules-based genotypic susceptibility score (GSS), several individual mutations, and the peak viral load before treatment change. In the multivariate analysis, the only genotype-derived variables that remained significantly associated with virological success were the GSS and, with a 10-fold stronger association, the IGB to the regimen. When predicting suppression of viral load below 400 copies/ml, the IGB outperformed the GSS and also significantly improved GSS-containing predictors, but the difference was not significant for suppression below 50 copies/ml. Thus, the IGB to a regimen is a novel data-derived predictor of treatment outcome that has the potential to improve the interpretation of genotypic drug resistance tests.
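To make the IGB concept concrete, here is a deliberately toy sketch (the mutation names, probabilities and resistance labels are invented; in the paper these quantities come from fitted I-CBNs, whose machinery is not reproduced here):

    # Probabilities, under some fitted model, of the mutation sets the virus
    # may acquire next from a patient's current genotype (invented numbers).
    future_states = {
        frozenset(): 0.40,               # no further mutation
        frozenset({"M184V"}): 0.35,      # confers resistance to the drug
        frozenset({"K65R"}): 0.15,       # confers resistance to the drug
        frozenset({"L74V"}): 0.10,       # does not confer resistance here
    }
    resistant = {frozenset({"M184V"}), frozenset({"K65R"})}

    # IGB to the drug: probability of NOT acquiring a resistance-conferring
    # mutation set.
    igb = sum(p for state, p in future_states.items() if state not in resistant)
    print(igb)  # 0.5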
Abstract:
Human interaction with the species with which we cohabit has created cases such as that of the Black Vulture (Aegypius monachus), Europe's largest scavenging bird, which is currently in danger of extinction. Anthropogenic land use (livestock farming, power lines, urban areas, etc.) has reduced the population of this species. Among the indirect human impacts on these populations, two stand out. The most important is poisoning: the illegal use of poison for predator control causes toxins to move up the trophic pyramid. The second is the reduction in food supply caused by myxomatosis and by current regulations requiring the removal of dead animals from mountain areas below 1400 m. An analysis of these problems shows that the vast majority are anthropogenic and stem from misinformation and lack of awareness; a sound reintroduction plan therefore requires prior outreach work with the local population for the project to be effective. The project consists of raising social awareness among the different groups involved (tourism, schools and the general population; livestock farmers and hunters) by means of graphical documents, educational material, support material, and outreach materials and methods. The goal is for the Alinyà valley and Boumort to become notable for the established presence of Europe's four most important scavenging birds: the Egyptian Vulture, the Bearded Vulture, the Griffon Vulture and our protagonist, the Black Vulture.
Abstract:
Since the beginning of advertising, art has been an inexhaustible source of inspiration for advertising creatives, who have used or misused it without limits: with more or less direct references to artists, works or art movements, with references to the act of artistic creation itself, and with certain aesthetic categories. Taking into account the various semantic versions that art offers, with their multiple connotative readings, this use has often not been as profitable as it could have been. In other circumstances, the use has been vampirising, concerned only with the audience impact of a well-known reference while despising it and reducing it to a mere lure. The Gioconda by Leonardo da Vinci and the Birth of Venus by Botticelli are significant examples of this perhaps popular use of artistic production. That is to say, the repository of art was used either as a source of references to enrich the readings of advertising pieces (for instance, the excellent Citroen Xsara Picasso advertisement with its Fordist mass-production chain) or simply to gain greater notoriety (like the Chupa-Chups advertisement in which the Mona Lisa heartily sucks one of the mythical lollipops). Through a selection of advertising pieces, graphical and audiovisual, we trace this logic of advertising creation, from the moment it decided to call up these art references that lay dreaming in the repository to the present, strictly cultural, moment.
Abstract:
Iowa has approximately 1000 bridges that have been overlaid with a nominal 2" of portland cement concrete. A Delamtect survey of a sampling of the older overlaid bridges indicated delaminations in several of them. Eventually these bridges, as well as those that have not received an overlay, must be programmed for rehabilitation. Prior to rehabilitation, the delaminated areas must be identified. There are currently two standard methods of determining delaminated areas in bridge decks: sounding with a metal object or a chain drag, and sounding with an electro-mechanical sounding system (Delamtect). Sounding with a metal object or chain drag is time consuming, and its accuracy depends on the ear of the operator and may be affected by traffic noise. The Delamtect requires less field time, but the graphical traces require that data reduction be done in the office. A recently developed method of detecting delamination is infrared thermography. This method is based on the temperature difference between sound and delaminated concrete. A contract was negotiated with Donohue and Associates, Inc. of Sheboygan, Wisconsin, to survey 18 portland cement concrete overlaid bridge decks in Iowa using the infrared thermography method of detecting delaminations.
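The underlying physical principle, a surface-temperature contrast between sound and delaminated concrete under solar heating, lends itself to a simple image-processing sketch (the threshold, array sizes and simulated data below are illustrative assumptions, not survey parameters):

    import numpy as np

    def flag_delaminations(thermal_image, delta_t=0.5):
        """Mark pixels noticeably warmer than the deck's typical surface.
        thermal_image is a 2-D array of surface temperatures (deg C);
        delaminated areas heat faster in the sun and appear as warm
        anomalies. The 0.5 deg C default is a placeholder."""
        baseline = np.median(thermal_image)
        return thermal_image > baseline + delta_t

    deck = np.random.normal(25.0, 0.2, size=(100, 200))  # sound concrete
    deck[40:55, 120:150] += 1.5                          # simulated delamination
    mask = flag_delaminations(deck)
    print(f"flagged {mask.mean():.1%} of the deck area")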
Abstract:
Currently, individuals including designers, contractors, and owners learn about project requirements by studying a combination of paper and electronic copies of the construction documents, including the drawings, specifications (standard and supplemental), road and bridge standard drawings, design criteria, contracts, addenda, and change orders. This can be a tedious process, since one needs to go back and forth between the various documents (paper or electronic) to obtain information about the entire project. Object-oriented computer-aided design (OO-CAD) is an innovative technology that can change this process through the graphical portrayal of information. OO-CAD allows users to point and click on portions of an object-oriented drawing that are linked to relevant databases of information (e.g., specifications, procurement status, and shop drawings). The vision of this study is to turn paper-based design standards and construction specifications into an object-oriented design and specification (OODAS) system, or visual electronic reference library (ERL). Individuals can use the system through a handheld wireless book-size laptop that includes all of the necessary software for operating in a 3D environment. All parties involved in transportation projects can access all of the standards and requirements simultaneously using a 3D graphical interface. By using this system, users will have all of the design elements and all of the specifications readily available without concern for omissions. A prototype object-oriented model was created and demonstrated to potential users representing counties, cities, and the state. Findings suggest that such a system could improve the productivity of finding information by as much as 75% and provide a greater sense of confidence that all relevant information has been identified. It was also apparent that this system would be used by more people in construction than in design, and there was concern about the cost to develop and maintain the complete system. Future work should focus on a project-based system that helps contractors and DOT inspectors find information (e.g., road standards, specifications, instructional memorandums) more rapidly as it pertains to a specific project.
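The core data idea, drawing objects hyperlinked to the documents that govern them, can be sketched in a few lines (the object identifier, document names and fields below are hypothetical):

    # Hypothetical link table for an OODAS/ERL-style system: each drawing
    # object points at the documents that govern it.
    links = {
        "girder_B12": {
            "specification": "standard spec, structural concrete section",
            "standard_drawing": "bridge standard drawing",
            "shop_drawing": "shop drawing, latest revision",
        },
    }

    def documents_for(object_id):
        """Return what a user should see after clicking a drawing object."""
        return links.get(object_id, {})

    print(documents_for("girder_B12"))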
Abstract:
Is it possible to perfectly simulate a signature, in the particular and challenging case where the signature is simple? A set of signatures from six writers, considered simple on the basis of defined criteria, was sampled. These signatures were given to forgers who were asked to produce freehand simulations. Among these simulations, those capable of reproducing the features of the reference signatures were submitted for evaluation to forensic document experts through proficiency testing. The results suggest that there is no perfect simulation. With the supplementary aim of assessing the influence of the forger's skills on the results, forgers were selected from three distinct populations that differ according to professional criteria. The results indicate some differences in graphical capabilities between individuals. However, no trend could be established regarding age, degrees, years of practice or time dedicated to the exercise. The findings show that simulation is made easier if a graphical compatibility exists between the forger's own writing and the signature to be reproduced. Moreover, a general difficulty in preserving proportions and slant, as well as the shape of capital letters and initials, was noticed.
Abstract:
In 1851 the French social economist Auguste Ott discussed the problem of gluts and commercial crises, together with the issue of distributive justice between workers in co-operative societies. He did so by means of a 'simple reproduction scheme' sharing some features with modern intersectoral transactions tables, in particular in terms of their graphical representation. This paper presents Ott's theory of crises (which was based on the disappointment of expectations) and the context of his model, and discusses its peculiarities, supplying a new piece for the reconstruction of the prehistory of input-output analysis.
Abstract:
Well-developed experimental procedures currently exist for retrieving and analyzing particle evidence from the hands of individuals suspected of being associated with the discharge of a firearm. Although analytical approaches (e.g. automated Scanning Electron Microscopy with Energy Dispersive X-ray (SEM-EDS) microanalysis) allow the determination of the presence of elements typically found in gunshot residue (GSR) particles, such analyses provide no information about a given particle's actual source. Possible origins for which scientists may need to account are a primary exposure to the discharge of a firearm or a secondary transfer due to a contaminated environment. In order to approach such sources of uncertainty in the context of evidential assessment, this paper studies the construction and practical implementation of graphical probability models (i.e. Bayesian networks). These can assist forensic scientists in making the issue tractable within a probabilistic perspective. The proposed models focus on likelihood ratio calculations at various levels of detail, as well as on case pre-assessment.
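As a toy illustration of the likelihood-ratio reasoning such networks support (the propositions and all probabilities are invented placeholders; the paper's models are considerably richer), the LR compares the probability of the findings under the competing propositions:

    # E = "characteristic GSR particles found on the suspect's hands".
    p_e_given_discharge = 0.80   # P(E | suspect discharged a firearm)
    p_e_given_secondary = 0.05   # P(E | secondary transfer / contamination only)

    lr = p_e_given_discharge / p_e_given_secondary
    print(f"LR = {lr:.0f}")  # 16: the findings are 16 times more probable
                             # under the discharge proposition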
Abstract:
Forensic scientists face increasingly complex inference problems for evaluating likelihood ratios (LRs) for an appropriate pair of propositions. Up to now, scientists and statisticians have derived LR formulae using an algebraic approach. However, this approach reaches its limits when addressing cases with an increasing number of variables and dependence relationships between these variables. In this study, we suggest using a graphical approach, based on the construction of Bayesian networks (BNs). We first construct a BN that captures the problem, and then deduce the expression for calculating the LR from this model to compare it with existing LR formulae. We illustrate this idea by applying it to the evaluation of an activity level LR in the context of the two-trace transfer problem. Our approach allows us to relax assumptions made in previous LR developments, produce a new LR formula for the two-trace transfer problem and generalize this scenario to n traces.
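For reference, the quantity being derived is the standard likelihood ratio (the notation below is generic, not the paper's specific two-trace formula):

    \mathrm{LR} = \frac{\Pr(E \mid H_1, I)}{\Pr(E \mid H_2, I)}

where E denotes the findings, H_1 and H_2 the competing (here, activity-level) propositions, and I the background information; the BN supplies the two conditional probabilities.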
Abstract:
BACKGROUND: Analysis of DNA sequence polymorphisms can provide valuable information on the evolutionary forces shaping nucleotide variation and insight into the functional significance of genomic regions. Recent and ongoing genome projects will radically improve our capability to detect specific genomic regions shaped by natural selection. Currently available methods and software, however, are unsatisfactory for such genome-wide analysis. RESULTS: We have developed methods for the analysis of DNA sequence polymorphisms at the genome-wide scale. These methods, which have been tested on coalescent-simulated data and on actual data files from mouse and human, have been implemented in the VariScan software package version 2.0. Additionally, we have incorporated a graphical user interface. The main features of this software are: i) exhaustive population-genetic analyses, including those based on coalescent theory; ii) analysis adapted to the shallow data generated by high-throughput genome projects; iii) use of genome annotations to conduct comprehensive analyses separately for different functional regions; iv) identification of relevant genomic regions by sliding-window and wavelet-multiresolution approaches; v) visualization of the results, integrated with current genome annotations, in commonly available genome browsers. CONCLUSION: VariScan is a powerful and flexible software suite for the analysis of DNA polymorphisms. The current version implements new algorithms, methods, and capabilities, providing an important tool for an exhaustive exploratory analysis of genome-wide DNA polymorphism data.
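One core computation behind such tools, sliding-window nucleotide diversity, can be sketched briefly (window size, step and the toy alignment are illustrative; this is not VariScan's implementation):

    from itertools import combinations

    def nucleotide_diversity(seqs):
        """Average proportion of differing sites over all pairs of aligned
        sequences (pi)."""
        length = len(seqs[0])
        pairs = list(combinations(seqs, 2))
        diffs = sum(sum(a != b for a, b in zip(s1, s2)) for s1, s2 in pairs)
        return diffs / (len(pairs) * length)

    def sliding_pi(seqs, window=4, step=2):
        """Pi within each window along the alignment."""
        length = len(seqs[0])
        return [(start, nucleotide_diversity([s[start:start + window] for s in seqs]))
                for start in range(0, length - window + 1, step)]

    alignment = ["ACGTACGTAC",  # toy aligned sequences
                 "ACGTACGAAC",
                 "ACCTACGTAC"]
    print(sliding_pi(alignment))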