915 results for "Database search; Evidential value; Bayesian decision theory; Influence diagrams"


Relevance:

100.00%

Publisher:

Abstract:

In this paper we apply a new approach to determining the priority vector of the pairwise comparison matrix, which plays an important role in decision theory. The divergence between the pairwise comparison matrix A and the consistent matrix B defined by the priority vector is measured with the help of the Kullback-Leibler relative entropy function. Minimizing this divergence leads to a convex program in the case of a complete matrix, and to a fixed-point problem in the case of an incomplete matrix. The priority vector minimizing the divergence also has the property that the difference between the sum of the elements of matrix A and the sum of the elements of matrix B is exactly n times the minimum of the divergence function, where n is the dimension of the problem. Thus there are two reasons for considering the minimum of the divergence function as a measure of the inconsistency of the matrix A.
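The abstract's minimization can be sketched numerically. The following is an illustrative sketch only, under assumptions: the small matrix A is invented, and a generalised Kullback-Leibler divergence is used as the objective; the paper's exact formulation may normalise differently.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed example: a fully consistent 3x3 pairwise comparison matrix.
A = np.array([[1.0, 2.0, 4.0],
              [0.5, 1.0, 2.0],
              [0.25, 0.5, 1.0]])
n = A.shape[0]

def divergence(log_w):
    w = np.exp(log_w)          # log parametrisation keeps weights positive
    B = np.outer(w, 1.0 / w)   # consistent matrix B_ij = w_i / w_j
    # generalised KL divergence between A and B (an assumption here)
    return np.sum(A * np.log(A / B) - A + B)

res = minimize(divergence, np.zeros(n), method="BFGS")
w = np.exp(res.x)
w /= w.sum()                   # normalised priority vector
```

For this consistent A the minimum is 0 and w recovers the ratios 4:2:1; a strictly positive minimum would signal inconsistency, as the abstract suggests.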

Relevance:

100.00%

Publisher:

Abstract:

It is generally believed that restaurant reviews can influence consumers' decisions in choosing a restaurant. A survey administered to a sample of 420 college faculty and staff members suggests that while most restaurant patrons may read reviews, they are not used as the sole selection criterion. Recommendations of friends, the restaurant's current reputation, and perceived value may have greater influence upon the choice than does a review. The authors discuss the implications of both favorable and unfavorable reviews.

Relevance:

100.00%

Publisher:

Abstract:

An inference task is one in which some known set of information is used to produce an estimate about an unknown quantity. Existing theories of how humans make inferences include specialized heuristics that allow people to make these inferences in familiar environments quickly and without unnecessarily complex computation. Specialized heuristic processing may be unnecessary, however; other research suggests that the same patterns in judgment can be explained by existing patterns in encoding and retrieving memories. This dissertation compares and attempts to reconcile three alternate explanations of human inference. After developing hierarchical Bayesian versions of three existing inference models, the three models are compared on simulated, observed, and experimental data. The results suggest that the three models capture different patterns in human behavior but, based on posterior prediction using laboratory data, potentially ignore important determinants of the decision process.

Relevance:

100.00%

Publisher:

Abstract:

Maintaining accessibility to and understanding of digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organizations. The processes of preservation planning and evaluation are fundamentally implicit and share similar complexity. Both demand comprehensive knowledge and understanding of every aspect of to-be-preserved content and the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in terms of diversity, but are integral to the establishment of classes of risk exposure, and the planning and deployment of appropriate preservation strategies. We explore several research objectives within the course of this thesis. Our main objective is the conception of an ontology for risk management of digital collections. Incorporated within this are our aims to survey the contexts within which preservation has been undertaken successfully, the development of an appropriate methodology for risk management, the evaluation of existing preservation evaluation approaches and metrics, the structuring of best practice knowledge and lastly the demonstration of a range of tools that utilise our findings. We describe a mixed methodology that uses interview and survey, extensive content analysis, practical case study and iterative software and ontology development. We build on a robust foundation, the development of the Digital Repository Audit Method Based on Risk Assessment. 
We summarise the extent of the challenge facing the digital preservation community (and, by extension, users and creators of digital materials across many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, increasing complexity, and the increasing onus on communities with no formal training to offer assurances of data management and sustainability. Collectively they demand an intuitive and adaptable means of evaluating digital preservation efforts; the need for individuals and organisations to validate the legitimacy of their own efforts is a particular priority. We introduce our approach, based on risk management. Risk expresses both the likelihood of a negative outcome and the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity: a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risks can be characterised according to associated goals, activities, responsibilities and policies, in terms of both their manifestation and their mitigation; they can be deconstructed into atomic units, and responsibility for their resolution delegated appropriately. We go on to describe how the manifestation of risks typically spans an entire organisational environment, so that taking risk as the focus of analysis safeguards against the omissions that may occur in functional, departmental or role-based assessment. We discuss the importance of relating risk factors, through the risks themselves or through associated system elements; doing so yields the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community.
We present as research outcomes an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conduct our analyses in the formal evaluation of memory institutions in the UK, US and continental Europe. Furthermore we showcase a series of applications that use the fruits of this research as their intellectual foundation. Finally we document our results in a range of technical reports and conference and journal articles. We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments. We then aggregate this into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision making and evaluation. The methodology leading to this ontology is outlined, and lessons are exposed by revisiting legacy studies and exposing the resource and associated applications to evaluation by the digital preservation community.
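The thesis characterises risk by likelihood, impact, and delegated responsibility. As a toy illustration only, a minimal risk record might look like the sketch below; the class, field names, and example values are hypothetical, and the PORRO ontology described above models risks far more richly.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    # Hypothetical fields echoing the thesis's characterisation of risk.
    name: str
    likelihood: float  # probability of the negative outcome (0..1)
    impact: float      # severity if it occurs (0..1)
    owner: str         # party to whom mitigation is delegated

    def exposure(self) -> float:
        # Simple likelihood-times-impact score used to rank risks.
        return self.likelihood * self.impact

r = Risk("format obsolescence", likelihood=0.4, impact=0.9,
         owner="repository manager")
```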

Relevance:

100.00%

Publisher:

Abstract:

Introduction: Mining is considered one of the most important economic sectors, both for its capacity to generate resources within its own sector and in others such as metalworking, agriculture and information technology, and for its contribution to the sustainable socioeconomic development of communities. Objective: To determine the relationship between the risks perceived by underground mining workers in 3 departments of Colombia and occupational accidents (AT) and occupational diseases (EL). Materials and Methods: Cross-sectional study of 476 underground mining workers. Independent variables (sociodemographic and occupational characteristics and risk perception) and dependent variables (occupational disease and occupational accident) were collected through direct interviews administered by previously trained health professionals. Fisher's exact test and the odds ratio (OR) with 95% confidence interval (CI) were used for the statistical analysis. Results: Among underground mining workers in the departments of Boyacá, Cundinamarca and Santander, a statistically significant association was found between accidents and perceived risk from lighting (OR = 2.059, 95% CI: 1.116, 3.798, p = 0.013), perceived risk from repetitive movements (OR = 1.951, 95% CI: 0.998, 3.815, p = 0.034), perceived risk from noise (OR = 2.275, 95% CI: 0.974, 5.312, p = 0.039) and perceived risk from load handling (OR = 1.778, 95% CI: 0.969, 3.264, p = 0.041). Conclusion: A significant relationship was found between underground mining workers' risk perception and occupational accidents, and no relationship was found between this perception and occupational diseases.
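The study's core statistics, an odds ratio with a Fisher exact test on a 2x2 table, can be sketched as below. The counts are invented for illustration (chosen so the sample odds ratio lands near the reported OR = 2.059 for lighting); the paper does not give the raw table here.

```python
from scipy import stats

# Hypothetical 2x2 table: accident occurrence vs perceived lighting risk.
#                 perceived risk   no perceived risk
# accident              60                80
# no accident           40               110
table = [[60, 80],
         [40, 110]]

# scipy returns the sample odds ratio (ad/bc) and the two-sided p-value.
odds_ratio, p_value = stats.fisher_exact(table)
```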

Relevance:

100.00%

Publisher:

Abstract:

This article contributes to research in behavioral corporate finance, a branch of corporate finance which holds that the individual making financial decisions is not fully rational and that, as a result, psychological biases influence those decisions. The paper focuses, both conceptually and through the analysis of a field study, on the influence of happiness on long-term asset investment decisions for a group of seven managers based in the city of Bogotá in 2016. The document covers the general concept of behavioral corporate finance, defines happiness, and presents sub-variables that determine an individual's happiness: health, work/life balance, education and skills, social connections, and environment. Finally, it shows how these affect financial managers' decisions according to the research conducted.

Relevance:

100.00%

Publisher:

Abstract:

Often in biomedical research, we deal with continuous (clustered) proportion responses ranging between zero and one that quantify the disease status of the cluster units. Interestingly, the study population might also consist of relatively disease-free as well as highly diseased subjects, contributing to proportion values in the interval [0, 1]. Regression on a variety of parametric densities with support lying in (0, 1), such as beta regression, can assess important covariate effects. However, these are deemed inappropriate due to the presence of zeros and/or ones. To circumvent this, we introduce a class of general proportion densities and further augment it with point masses at zero and one, while controlling for the clustering. Our approach is Bayesian and presents a computationally convenient framework amenable to available freeware. Bayesian case-deletion influence diagnostics based on q-divergence measures are automatic from the Markov chain Monte Carlo output. The methodology is illustrated using both simulation studies and application to a real dataset from a clinical periodontology study.
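The augmentation idea can be sketched with a zero-one-augmented beta density: point masses at 0 and 1 plus a beta density on (0, 1). This is a simplified stand-in, with assumed parameter values; the paper's "general proportion density" class is broader and is fitted in a Bayesian framework with cluster-level effects.

```python
from scipy import stats
from scipy.integrate import quad

def zoab_density(y, p0, p1, a, b):
    # Point mass for disease-free units, point mass for fully diseased
    # units, and a rescaled beta density on the open interval (0, 1).
    if y == 0.0:
        return p0
    if y == 1.0:
        return p1
    return (1.0 - p0 - p1) * stats.beta.pdf(y, a, b)

p0, p1, a, b = 0.10, 0.05, 2.0, 5.0   # assumed parameter values
mass, _ = quad(lambda y: zoab_density(y, p0, p1, a, b), 0.0, 1.0)
total = p0 + p1 + mass                # point masses + continuous part
```

The check that `total` equals 1 confirms the augmented object is a valid mixed (discrete plus continuous) distribution.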

Relevance:

100.00%

Publisher:

Abstract:

1. Establishing biological control agents in the field is a major step in any classical biocontrol programme, yet there are few general guidelines to help the practitioner decide what factors might enhance the establishment of such agents. 2. A stochastic dynamic programming (SDP) approach, linked to a metapopulation model, was used to find optimal release strategies (number and size of releases), given constraints on time and the number of biocontrol agents available. By modelling within a decision-making framework we derived rules of thumb that will enable biocontrol workers to choose between management options, depending on the current state of the system. 3. When there are few well-established sites, making a few large releases is the optimal strategy. For other states of the system, the optimal strategy ranges from a few large releases, through a mixed strategy (a variety of release sizes), to many small releases, as the probability of establishment of smaller inocula increases. 4. Given that the probability of establishment is rarely a known entity, we also strongly recommend a mixed strategy in the early stages of a release programme, to accelerate learning and improve the chances of finding the optimal approach.
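A stochastic dynamic program over release decisions can be sketched with backward induction. Everything below is an assumption for illustration: the state (number of established sites), horizon, site cap, and establishment probabilities are invented, whereas the paper couples SDP to a full metapopulation model.

```python
T = 5            # release seasons remaining (assumed horizon)
S_MAX = 4        # cap on established sites tracked in the toy state space
P_LARGE = 0.60   # assumed establishment probability of one large release
P_SMALL = 0.35   # assumed establishment probability of each small release

# V[t][s] = max expected established sites at the horizon, given s sites
# now and t seasons left; policy[t][s] = the action achieving it.
V = [[float(s) for s in range(S_MAX + 1)]] + \
    [[0.0] * (S_MAX + 1) for _ in range(T)]
policy = [[None] * (S_MAX + 1) for _ in range(T + 1)]

for t in range(1, T + 1):
    for s in range(S_MAX + 1):
        nxt = lambda k: V[t - 1][min(s + k, S_MAX)]
        # One large release: at most one new site.
        v_large = P_LARGE * nxt(1) + (1 - P_LARGE) * nxt(0)
        # Two small releases: 0, 1 or 2 new sites (binomial outcomes).
        v_small = ((1 - P_SMALL) ** 2 * nxt(0)
                   + 2 * P_SMALL * (1 - P_SMALL) * nxt(1)
                   + P_SMALL ** 2 * nxt(2))
        V[t][s], policy[t][s] = max((v_large, "one large"),
                                    (v_small, "two small"))
```

Even in this toy version the optimal action is state-dependent: near the site cap the one-large option can overtake two small releases, echoing point 3 of the summary.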

Relevance:

100.00%

Publisher:

Abstract:

Mitochondrial DNA (mtDNA) population data for forensic purposes are still scarce for some populations, which may limit the evaluation of forensic evidence, especially when the rarity of a haplotype needs to be determined in a database search. In order to improve the collection of mtDNA lineages from the Iberian and South American subcontinents, we here report the results of a collaborative study involving nine laboratories from the Spanish and Portuguese Speaking Working Group of the International Society for Forensic Genetics (GHEP-ISFG) and EMPOP. The individual laboratories contributed population data that were generated throughout the past 10 years but in the majority of cases had not been made available to the scientific community. A total of 1019 haplotypes from Iberia (Basque Country, 2 general Spanish populations, 2 North and 1 Central Portugal populations) and Latin America (3 populations from Sao Paulo) were collected, reviewed and harmonized according to defined EMPOP criteria. The majority of data ambiguities found during the reviewing process (41 in total) were transcription errors, confirming that the documentation process is still the most error-prone stage in reporting mtDNA population data, especially when performed manually. This GHEP-EMPOP collaboration has significantly improved the quality of the individual mtDNA datasets and adds mtDNA population data as a valuable resource to the EMPOP database (www.empop.org). (C) 2010 Elsevier Ireland Ltd. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

Aims We conducted a meta-analysis to evaluate the accuracy of quantitative stress myocardial contrast echocardiography (MCE) in coronary artery disease (CAD). Methods and results Database search was performed through January 2008. We included studies evaluating accuracy of quantitative stress MCE for detection of CAD compared with coronary angiography or single-photon emission computed tomography (SPECT) and measuring reserve parameters of A, beta, and A beta. Data from studies were verified and supplemented by the authors of each study. Using random effects meta-analysis, we estimated weighted mean difference (WMD), likelihood ratios (LRs), diagnostic odds ratios (DORs), and summary area under curve (AUC), all with 95% confidence interval (CI). Of 1443 studies, 13 including 627 patients (age range, 38-75 years) and comparing MCE with angiography (n = 10), SPECT (n = 1), or both (n = 2) were eligible. WMD (95% CI) were significantly less in CAD group than no-CAD group: 0.12 (0.06-0.18) (P < 0.001), 1.38 (1.28-1.52) (P < 0.001), and 1.47 (1.18-1.76) (P < 0.001) for A, beta, and A beta reserves, respectively. Pooled LRs for positive test were 1.33 (1.13-1.57), 3.76 (2.43-5.80), and 3.64 (2.87-4.78) and LRs for negative test were 0.68 (0.55-0.83), 0.30 (0.24-0.38), and 0.27 (0.22-0.34) for A, beta, and A beta reserves, respectively. Pooled DORs were 2.09 (1.42-3.07), 15.11 (7.90-28.91), and 14.73 (9.61-22.57) and AUCs were 0.637 (0.594-0.677), 0.851 (0.828-0.872), and 0.859 (0.842-0.750) for A, beta, and A beta reserves, respectively. Conclusion Evidence supports the use of quantitative MCE as a non-invasive test for detection of CAD. Standardizing MCE quantification analysis and adherence to reporting standards for diagnostic tests could enhance the quality of evidence in this field.
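Random-effects pooling of a weighted mean difference is commonly done with the DerSimonian-Laird estimator, sketched below. The study-level values are invented for illustration; the meta-analysis above pools reserve parameters from 13 MCE studies, not these numbers.

```python
import numpy as np

md = np.array([0.10, 0.15, 0.08])         # assumed per-study mean differences
var = np.array([0.0020, 0.0030, 0.0025])  # assumed per-study variances

w = 1.0 / var                             # fixed-effect (inverse-variance) weights
fixed = np.sum(w * md) / w.sum()
Q = np.sum(w * (md - fixed) ** 2)         # Cochran's heterogeneity statistic
k = len(md)
# DerSimonian-Laird between-study variance (truncated at zero).
tau2 = max(0.0, (Q - (k - 1)) / (w.sum() - np.sum(w ** 2) / w.sum()))

w_star = 1.0 / (var + tau2)               # random-effects weights
pooled = np.sum(w_star * md) / w_star.sum()
se = 1.0 / np.sqrt(w_star.sum())
ci = (pooled - 1.96 * se, pooled + 1.96 * se)  # 95% CI for the pooled WMD
```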

Relevance:

100.00%

Publisher:

Abstract:

We report a further characterization of the genomic region containing the soybean supernodulation gene NTS-1. We performed a search for new markers linked to NTS-1 by combining DNA amplification fingerprinting (DAF) and bulked segregant analysis (BSA). The search resulted in one cloned polymorphism (B44-456) linked in trans, 8.5 cM from the locus. Southern hybridization showed duplication of the B44-456 sequence in the soybean genome. Additionally, a DNA database search revealed one Arabidopsis thaliana genomic clone from chromosome I possessing 62% homology to the B44-456 marker. A relatively low number of polymorphisms were identified by several PCR marker technologies for this soybean genomic region, providing additional support for its highly conserved and/or duplicated organization.

Relevance:

100.00%

Publisher:

Abstract:

We present a novel maximum-likelihood-based algorithm for estimating the distribution of alignment scores from the scores of unrelated sequences in a database search. Using a new method for measuring the accuracy of p-values, we show that our maximum-likelihood-based algorithm is more accurate than existing regression-based and lookup table methods. We explore a more sophisticated way of modeling and estimating the score distributions (using a two-component mixture model and expectation maximization), but conclude that this does not improve significantly over simply ignoring scores with small E-values during estimation. Finally, we measure the classification accuracy of p-values estimated in different ways and observe that inaccurate p-values can, somewhat paradoxically, lead to higher classification accuracy. We explain this paradox and argue that statistical accuracy, not classification accuracy, should be the primary criterion in comparisons of similarity search methods that return p-values that adjust for target sequence length.
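The basic pattern of fitting a null score distribution by maximum likelihood and converting a query score into a p-value can be sketched with a Gumbel (extreme-value) model, a common choice for alignment scores. This is a simplification under assumed parameters: the paper's algorithm also handles censoring of small E-values and length-adjusted scores.

```python
import numpy as np
from scipy import stats

# Simulate scores of unrelated sequences from an assumed Gumbel null.
rng = np.random.default_rng(42)
null_scores = stats.gumbel_r.rvs(loc=10.0, scale=3.0, size=5000,
                                 random_state=rng)

# Maximum-likelihood fit of the null distribution to the observed scores.
loc, scale = stats.gumbel_r.fit(null_scores)

# p-value of a query score: probability an unrelated sequence scores >= 25.
p_value = stats.gumbel_r.sf(25.0, loc, scale)
```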

Relevance:

100.00%

Publisher:

Abstract:

A decision theory framework can be a powerful technique to derive optimal management decisions for endangered species. We built a spatially realistic stochastic metapopulation model for the Mount Lofty Ranges Southern Emu-wren (Stipiturus malachurus intermedius), a critically endangered Australian bird. Using discrete-time Markov chains to describe the dynamics of a metapopulation and stochastic dynamic programming (SDP) to find optimal solutions, we evaluated the following different management decisions: enlarging existing patches, linking patches via corridors, and creating a new patch. This is the first application of SDP to optimal landscape reconstruction and one of the few times that landscape reconstruction dynamics have been integrated with population dynamics. SDP is a powerful tool that has advantages over standard Monte Carlo simulation methods because it can give the exact optimal strategy for every landscape configuration (combination of patch areas and presence of corridors) and pattern of metapopulation occupancy, as well as a trajectory of strategies. It is useful when a sequence of management actions can be performed over a given time horizon, as is the case for many endangered species recovery programs, where only fixed amounts of resources are available in each time step. However, it is generally limited by computational constraints to rather small networks of patches. The model shows that optimal metapopulation management decisions depend greatly on the current state of the metapopulation, and there is no strategy that is universally the best. The extinction probability over 30 yr for the optimal state-dependent management actions is 50-80% better than no management, whereas the best fixed state-independent sets of strategies are only 30% better than no management. This highlights the advantages of using a decision theory tool to investigate conservation strategies for metapopulations. 
It is clear from these results that the sequence of management actions is critical, and this can only be effectively derived from stochastic dynamic programming. The model illustrates the underlying difficulty in determining simple rules of thumb for the sequence of management actions for a metapopulation. This use of a decision theory framework extends the capacity of population viability analysis (PVA) to manage threatened species.
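The discrete-time Markov chain underpinning such extinction probabilities can be sketched in a toy form. The transition matrix below is invented, and the state is just the number of occupied patches; the paper's states additionally encode landscape configuration (patch areas, corridors).

```python
import numpy as np

# Rows: current state (0, 1 or 2 occupied patches); columns: next state.
P = np.array([[1.00, 0.00, 0.00],   # 0 patches: extinction is absorbing
              [0.20, 0.50, 0.30],   # from 1 occupied patch
              [0.05, 0.25, 0.70]])  # from 2 occupied patches

# State distribution after 30 annual steps, starting fully occupied.
dist_30 = np.linalg.matrix_power(P, 30)[2]
extinction_prob = dist_30[0]        # probability of being extinct by year 30
```

Comparing this quantity across candidate transition matrices (one per management action) is what the SDP optimises over.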

Relevance:

100.00%

Publisher:

Abstract:

Like many states and territories, South Australia has a legacy of marine reserves considered to be inadequate to meet current conservation objectives. In this paper we configured exploratory marine reserve systems, using the software MARXAN, to examine how efficiently South Australia's existing marine reserves contribute to quantitative biodiversity conservation targets. Our aim was to compare marine reserve systems that retain South Australia's existing marine reserves with reserve systems that are free to either ignore or incorporate them. We devised a new interpretation of irreplaceability to identify planning units selected more often than could be expected from chance alone. This is measured by comparing the observed selection frequency for an individual planning unit with a predicted selection frequency distribution. Knowing which sites make a valuable contribution to efficient marine reserve system design allows us to determine how well South Australia's existing reserves contribute to reservation goals when representation targets are set at 5, 10, 15, 20, 30 and 50% of conservation features. Existing marine reserves that fail to contribute to efficient marine reserve systems constitute 'opportunity costs'. We found that despite spanning less than 4% of South Australian state waters, locking in the existing ad hoc marine reserves presented considerable opportunity costs. Even with representation targets set at 50%, more than half of South Australia's existing marine reserves were selected randomly or less in efficient marine reserve systems. Hence, ad hoc marine reserve systems are likely to be inefficient and may compromise effective conservation of marine biodiversity.
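The "selected more often than chance" idea can be sketched as a one-sided binomial comparison. The run count, baseline probability, and observed count below are invented; the paper derives a predicted selection-frequency distribution for each planning unit rather than assuming a single fixed baseline.

```python
from scipy import stats

n_runs = 100       # assumed number of MARXAN reserve systems generated
baseline_p = 0.30  # assumed chance a random efficient system includes the unit
observed = 47      # assumed number of runs in which the unit was selected

# One-sided tail probability: P(X >= observed) under the binomial null.
p_value = stats.binom.sf(observed - 1, n_runs, baseline_p)
```

A small `p_value` flags the planning unit as selected more often than chance alone would predict, i.e. a valuable contributor to efficient reserve systems.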

Relevance:

100.00%

Publisher:

Abstract:

ENEGI 2013: Proceedings (Atas) of the 2nd National Meeting on Industrial Engineering and Management (Encontro Nacional de Engenharia e Gestão Industrial), Universidade de Aveiro, Aveiro, Portugal, 17-18 May 2013.