939 results for Evidence Containers, Representation, Provenance, Tool Interoperability
Abstract:
In Germany's compensatory mixed electoral system, alternative electoral routes lead into parliament. We study the relationship between candidates' electoral situations across both tiers and policy representation, fully accounting for candidate, party and district preferences in a multi-actor constellation and the exact electoral incentives for candidates to represent either the party or the district. The results (2009 Bundestag election data) yield evidence of an interactive effect of closeness of the district race and list safety on candidates' positioning between their party and constituency.
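The interactive effect described here lends itself to a compact illustration. The sketch below uses invented variable names and simulated data (not the paper's specification or data): a candidate's positioning between party and constituency is regressed on district-race closeness, list safety, and their product.

```python
# Illustrative sketch (invented variables, simulated data): candidate
# positioning regressed on district-race closeness, list safety, and
# their interaction.
import numpy as np

rng = np.random.default_rng(0)
n = 300
closeness = rng.uniform(0, 1, n)    # 1 = very close district race
list_safety = rng.uniform(0, 1, n)  # 1 = very safe list position
# Simulated outcome: a close race pulls candidates toward the district,
# but less so when the list seat is already safe (negative interaction).
position = (0.5 * closeness - 0.2 * list_safety
            - 0.4 * closeness * list_safety + rng.normal(0, 0.1, n))

X = np.column_stack([np.ones(n), closeness, list_safety,
                     closeness * list_safety])
beta, *_ = np.linalg.lstsq(X, position, rcond=None)
print(dict(zip(["intercept", "closeness", "list_safety", "interaction"],
               beta.round(3))))
```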
Abstract:
In this work we propose the adoption of a statistical framework used in the evaluation of forensic evidence as a tool for evaluating and presenting circumstantial "evidence" of a disease outbreak from syndromic surveillance. The basic idea is to exploit the predicted distributions of reported cases to calculate the ratio of the likelihood of observing n cases given an ongoing outbreak over the likelihood of observing n cases given no outbreak. The likelihood ratio defines the Value of Evidence (V). Using Bayes' rule, the prior odds for an ongoing outbreak are multiplied by V to obtain the posterior odds. This approach was applied to time series on the number of horses showing clinical respiratory symptoms or neurological symptoms. The separation between prior beliefs about the probability of an outbreak and the strength of evidence from syndromic surveillance offers a transparent reasoning process suitable for supporting decision makers. The value of evidence can be translated into a verbal statement, as often done in forensics or used for the production of risk maps. Furthermore, a Bayesian approach offers seamless integration of data from syndromic surveillance with results from predictive modeling and with information from other sources such as disease introduction risk assessments.
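The core calculation is compact enough to sketch. The following is a minimal illustration assuming Poisson predictive distributions for the daily case count (the paper's actual predictive model is not specified in this abstract, and the parameter values are invented):

```python
# Minimal sketch of the Value of Evidence calculation, assuming Poisson
# predictive distributions for the daily case count; parameters invented.
from scipy.stats import poisson

def value_of_evidence(n, lam_outbreak, lam_baseline):
    """V = P(n | ongoing outbreak) / P(n | no outbreak)."""
    return poisson.pmf(n, lam_outbreak) / poisson.pmf(n, lam_baseline)

n_observed = 9                    # horses reported with symptoms today
V = value_of_evidence(n_observed, lam_outbreak=6.0, lam_baseline=2.0)

prior_odds = 0.01 / 0.99          # prior belief in an ongoing outbreak
posterior_odds = prior_odds * V   # Bayes' rule in odds form
posterior_prob = posterior_odds / (1 + posterior_odds)
print(f"V = {V:.1f}, posterior P(outbreak) = {posterior_prob:.3f}")
```

The resulting V (or the posterior odds) can then be mapped onto a verbal scale of evidential strength, as is common in forensic reporting.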
Abstract:
Detrital provenance analyses in orogenic settings, in which sediments are collected at the outlet of a catchment, have become an important tool to estimate how erosion varies in space and time. Here we present how Raman Spectroscopy on Carbonaceous Material (RSCM) can be used for provenance analysis. RSCM provides an estimate of the peak temperature (RSCM-T) experienced during metamorphism. We show that we can infer modern erosion patterns in a catchment by combining new measurements on detrital sands with previously acquired bedrock data. We focus on the Whataroa catchment in the Southern Alps of New Zealand and exploit the metamorphic gradient that runs parallel to the main drainage direction. To account for potential sampling biases, we also quantify abrasion properties using flume experiments and measure the total organic carbon content in the bedrock that produced the collected sands. Finally, we integrate these parameters into a mass-conservative model. Our results first demonstrate that RSCM-T can be used for detrital studies. Second, we find that spatial variations in tracer concentration and erosion have a first-order control on the RSCM-T distributions, even though our flume experiments reveal that weak lithologies produce substantially more fine particles than do more durable lithologies. This result implies that sand specimens are good proxies for mapping spatial variations in erosion when the bedrock concentration of the target mineral is quantified. The modeling suggests that the highest present-day erosion rates in the Whataroa catchment are not situated at the range front but around 10 km into the mountain belt.
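The mass-conservative logic can be sketched in a few lines. In the toy model below (illustrative numbers, not the study's data), the expected detrital RSCM-T distribution is obtained by weighting each bedrock source unit by its area, erosion rate, and tracer concentration:

```python
# Toy mass-conservative mixing model: the detrital RSCM-T distribution is
# the bedrock signature of each source unit weighted by its area, erosion
# rate, and tracer (organic carbon) concentration. Numbers are illustrative.
import numpy as np

# Hypothetical source units along the metamorphic gradient
rscm_t = np.array([350.0, 450.0, 550.0])   # peak temperature, deg C
area = np.array([20.0, 15.0, 10.0])        # km^2
erosion = np.array([2.0, 5.0, 8.0])        # mm/yr
conc = np.array([0.8, 1.0, 0.5])           # tracer concentration, wt%

# Expected share of detrital grains carrying each RSCM-T signature
flux = area * erosion * conc
p = flux / flux.sum()
for t, share in zip(rscm_t, p):
    print(f"RSCM-T {t:.0f} C: expected detrital fraction {share:.2f}")
```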
Abstract:
AIM To analyse meta-analyses included in systematic reviews (SRs) published in leading orthodontic journals and the Cochrane Database of Systematic Reviews (CDSR) focusing on orthodontic literature and to assess the quality of the existing evidence. MATERIALS AND METHODS Electronic searching was undertaken to identify SRs published in five major orthodontic journals and the CDSR between January 2000 and June 2014. Quality assessment of the overall body of evidence from meta-analyses was conducted using the Grading of Recommendations Assessment, Development and Evaluation working group (GRADE) tool. RESULTS One hundred and fifty-seven SRs were identified; meta-analysis was present in 43 of these (27.4 per cent). The highest proportion of SRs that included a meta-analysis was found in Orthodontics and Craniofacial Research (6/13; 46.1 per cent), followed by the CDSR (12/33; 36.4 per cent) and the American Journal of Orthodontics and Dentofacial Orthopaedics (15/44; 34.1 per cent). Class II treatment was the most commonly addressed topic within SRs in orthodontics (n = 18/157; 11.5 per cent). The number of trials combined to produce a summary estimate was small for most meta-analyses, with a median of 4 (range: 2-52). Only 21 per cent (n = 9) of included meta-analyses were considered to have a high/moderate quality of evidence according to GRADE, while the majority were of low or very low quality (n = 34; 79.0 per cent). CONCLUSIONS Overall, approximately one quarter of orthodontic SRs included quantitative synthesis, with a median of four trials per meta-analysis. The overall quality of evidence from the selected orthodontic SRs was predominantly low to very low, indicating a relative lack of high-quality evidence from SRs to inform clinical practice guidelines.
Abstract:
Reconstructing past modes of ocean circulation is an essential task in paleoclimatology and paleoceanography. To this end, we combine two sedimentary proxies, Nd isotopes (εNd) and the 231Pa/230Th ratio, both of which are not directly involved in the global carbon cycle, but allow the reconstruction of water mass provenance and provide information about the past strength of overturning circulation, respectively. In this study, combined 231Pa/230Th and εNd down-core profiles from six Atlantic Ocean sediment cores are presented. The data set is complemented by the two available combined data sets from the literature. From this we derive a comprehensive picture of spatial and temporal patterns and the dynamic changes of the Atlantic Meridional Overturning Circulation over the past ∼25 ka. Our results provide evidence for a consistent pattern of glacial/stadial advances of Southern Sourced Water along with a northward circulation mode for all cores in the deeper (>3000 m) Atlantic. Results from shallower core sites support an active overturning cell of shoaled Northern Sourced Water during the LGM and the subsequent deglaciation. Furthermore, we report evidence for a short-lived period of intensified AMOC in the early Holocene.
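For reference, both proxies follow standard conventions (textbook definitions, not results of this study). εNd expresses the measured 143Nd/144Nd ratio relative to the Chondritic Uniform Reservoir (CHUR):

```latex
\varepsilon_{\mathrm{Nd}} = \left(
  \frac{\left(^{143}\mathrm{Nd}/^{144}\mathrm{Nd}\right)_{\mathrm{sample}}}
       {\left(^{143}\mathrm{Nd}/^{144}\mathrm{Nd}\right)_{\mathrm{CHUR}}} - 1
\right) \times 10^{4}
```

The sedimentary 231Pa/230Th activity ratio is read against the seawater production ratio of 0.093: values below 0.093 indicate export of 231Pa by a vigorous overturning circulation.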
Abstract:
Public health departments play an important role in promoting and preserving the health of communities. The lack of a system to ensure their quality and accountability led to the development of a national voluntary accreditation program by the Public Health Accreditation Board (PHAB). The concept that accreditation will lead to quality improvement in public health, which will ultimately lead to healthy communities, seems intuitive but lacks a robust body of evidence. A critical review of the literature was conducted to explore whether accreditation can lead to quality improvement in public health. The articles were selected from publicly available databases using a specific set of criteria for inclusion, exclusion, and appraisal. To understand the relationship between accreditation and quality improvement, the potential strengths and limitations of the accreditation process were evaluated. Recommendations for best practices are suggested so that public health accreditation can yield maximum benefits. A logic model framework to help depict the impact of accreditation on various levels of public health outcomes is also discussed in this thesis. The literature review shows that existing accreditation programs in other industries provide limited but encouraging evidence that accreditation will improve quality and strengthen the delivery of public health services. While progress in introducing accreditation in public health can be informed by other accredited industries, the public health field has its own set of challenges. Providing incentives, creating financing strategies, and having strong leadership will allow greater access to accreditation by all public health departments. The suggested recommendations include that continuous evaluation, public participation, a systems approach, a clear vision, and dynamic standards should become hallmarks of the accreditation process. Understanding the link between accreditation, quality improvement, and health outcomes will influence the successful adoption and implementation of the public health accreditation program. This review of the literature suggests that accreditation is an important step in improving the quality of public health departments and, ultimately, the health of communities. However, accreditation should be considered part of an integrated system of tools and approaches to improve public health practice. Hence, it is a means to an end, not an end unto itself.
Abstract:
Many lines of clinical and experimental evidence indicate a viral role in carcinogenesis (1-6). Our access to patient plasma, serum, and tissue samples from invasive breast cancer (N=19), ductal carcinoma in situ (N=13), malignant ovarian cancer (N=12), and benign ovarian tumors (N=9), via IRB-approved and informed consent protocols through M.D. Anderson Cancer Center, as well as normal donor plasmas purchased from Gulf Coast Regional Blood Center (N=6), has allowed us to survey primary patient blood and tissue samples, healthy donor blood from the general population, and commercially available human cell lines for the presence of human endogenous retrovirus K (HERV-K) Env viral RNA (vRNA), protein, and viral particles. We hypothesize that HERV-K proteins are tumor-associated antigens and as such can be profiled and targeted in patients for diagnostic and therapeutic purposes. To test this hypothesis, we employed isopycnic ultracentrifugation, a microplate-based reverse transcriptase enzyme activity assay, reverse transcription-polymerase chain reaction (RT-PCR), cDNA sequencing, SDS-PAGE and western blotting, immunofluorescent staining, confocal microscopy, and transmission electron microscopy to evaluate HERV-K activation in cancer. Data from large numbers of patients tested by the reverse transcriptase activity assay were analyzed statistically by t-test to determine the potential use of this assay as a diagnostic tool for cancer. Significant reverse transcriptase enzyme activity was detected in 75% of ovarian cancer, 53.8% of ductal carcinoma in situ, and 42.1% of invasive breast cancer patient samples. Only 11.1% of benign ovarian tumor patient samples and 16.7% of normal donor samples tested positive. HERV-K Env vRNA or Env SU protein was detected in the majority of cancer types screened, as demonstrated by the results shown herein, and was largely absent in normal controls. These findings support our hypothesis that the presence of HERV-K in patient blood circulation is an indicator of cancer or pre-malignancy in vivo, that the presence of HERV-K Env on tumor cell surfaces is indicative of a malignant phenotype, and that HERV-K Env is a tumor-associated antigen useful not only as a diagnostic screening tool to predict patient disease status, but also as an exploitable therapeutic target for various novel antibody-based immunotherapies.
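The group comparison described here (a t-test on reverse transcriptase activity between patient and control samples) can be sketched as follows; the data below are simulated and the group means invented, not the study's measurements:

```python
# Sketch of a two-sample comparison of reverse transcriptase activity
# between a patient group and normal donors. Simulated data only.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
rt_cancer = rng.normal(loc=5.0, scale=1.5, size=12)   # e.g. ovarian cancer
rt_control = rng.normal(loc=2.0, scale=1.0, size=6)   # normal donors

t, p = ttest_ind(rt_cancer, rt_control, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.4f}")
```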
Abstract:
A pressure core barrel (PCB), developed by the Deep Sea Drilling Project, was used successfully to recover, at in situ pressure, sediments of the Blake Outer Ridge, offshore of the southeastern United States. The PCB is a unique wire-line tool, 10.4 m long, capable of recovering 5.8 m of core (5.8 cm in diameter) maintained at or below in situ pressures of up to 34.4 megapascals (MPa), plus 1.8 m of unpressurized core (5.8 cm in diameter). All excess internal pressure above the operating pressure of 34.4 MPa is automatically vented off as the barrel is retrieved. The PCB was deployed five times at DSDP Site 533, where geophysical evidence suggests the presence of gas hydrates in the upper 600 m of sediment. Three cores were obtained holding average in situ pressures of 30 MPa. Two other cores did not maintain in situ pressures. Three of the five cores were intermittently degassed at varying intervals of time, and portions of the vented gas were collected for analysis. Pressure decline followed paths indicative of gas hydrates and/or dissolved gas. The released gas was dominantly methane (usually greater than 90%), along with higher-molecular-weight hydrocarbon gases and carbon dioxide. During degassing, the ratio of methane to ethane did not vary significantly. On the other hand, concentrations of higher-molecular-weight hydrocarbon gases increased, as did carbon dioxide concentrations. The results from the PCB experiments provide tentative but equivocal evidence for the presence of gas hydrates at Site 533. The amount of gas hydrate indicated is small. Nevertheless, this work represents the first successful study of marine gas hydrates utilizing the PCB.
Abstract:
Al, K, Sc and Ti concentrations are reported for the terrestrial-material-dominated sediments from ODP Site 1144. Comparison between the bulk and the acid-leached sediments indicates that about 20-30% of the Al, K and Sc in the bulk sediments is not hosted in terrestrial detritus but is of authigenic origin. Authigenic Ti, however, is negligible. The results indicate that Ti, rather than Al, is the best proxy for terrestrial materials. Significant climate controls are displayed in the Al/Ti, K/Ti and Sc/Ti variation patterns for both the bulk and the acid-leached sediments. These variation patterns can be accounted for mainly in terms of climate change in the provenance areas in South China. Elevated Al/Ti, K/Ti and Sc/Ti ratios during interglacial periods indicate that chemical weathering was then stronger than during glacial periods, which might be related to a more humid climate in interglacial periods.
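The bulk-versus-leachate reasoning reduces to a simple calculation. In the sketch below (illustrative concentrations on a common basis, not the reported data), the authigenic fraction of each element is the share of its bulk concentration not accounted for by the acid-leached detrital residue:

```python
# Sketch of the bulk vs. acid-leached comparison: the authigenic fraction
# of an element is the share of the bulk concentration removed by leaching.
# Values are illustrative, on an arbitrary common concentration basis.
bulk = {"Al": 8.0, "K": 2.5, "Sc": 12.0, "Ti": 0.50}
leached = {"Al": 6.0, "K": 1.9, "Sc": 9.0, "Ti": 0.49}  # detrital residue

for el in bulk:
    authigenic = 1 - leached[el] / bulk[el]
    print(f"{el}: authigenic fraction ~{authigenic:.0%}")
# Al, K and Sc come out near 20-30 % while Ti is near zero, matching the
# argument that Ti is the most faithful proxy for terrestrial detritus.
```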
Abstract:
Microinsurance is widely considered an important tool for sustainable poverty reduction, especially in the face of increasing climate risk. Although index-based microinsurance, which should be free from the classical incentive problems, has attracted considerable attention, uptake rates have generally been weak in low-income rural communities. We explore the purchase patterns of index-based livestock insurance in southern Ethiopia, focusing in particular on the role of accurate product comprehension and price, including the prospective impact of temporary discount coupons on subsequent period demand due to price anchoring effects. We find that randomly distributed learning kits contribute to improving subjects' knowledge of the products; however, we do not find strong evidence that the improved knowledge per se induces greater uptake. We also find that reduced price due to randomly distributed discount coupons has an immediate, positive impact on uptake, without dampening subsequent period demand due to reference-dependence associated with price anchoring effects.
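The randomized coupon design supports a straightforward comparison of uptake rates. The sketch below simulates such a comparison with a two-proportion z-test; the sample size, baseline uptake rate, and coupon effect are all invented:

```python
# Sketch of a randomized-coupon comparison: uptake rates by coupon
# assignment, with a two-proportion z-test. Simulated data only.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
coupon = rng.integers(0, 2, 500)                   # random coupon assignment
uptake = rng.random(500) < (0.15 + 0.10 * coupon)  # coupons raise uptake

p1, p0 = uptake[coupon == 1].mean(), uptake[coupon == 0].mean()
n1, n0 = (coupon == 1).sum(), (coupon == 0).sum()
p_pool = uptake.mean()
z = (p1 - p0) / np.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n0))
print(f"uptake: coupon {p1:.2f} vs no coupon {p0:.2f}, "
      f"z = {z:.2f}, p = {2 * norm.sf(abs(z)):.4f}")
```

Testing the anchoring question would additionally require comparing subsequent-period demand between former coupon recipients and non-recipients, which the same design supports.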
Abstract:
Chinese agricultural cooperatives, called Farmers' Professional Cooperatives (FPCs), are expected to become a major tool for facilitating agro-industrialization among small farmers through the diffusion of new technologies, the supply of high-quality agricultural inputs and the marketing of their products. This study compares FPC participants with vegetable-producing non-participants and with grain farmers in vegetable-producing areas of rural China to investigate the treatment effect of participation in FPCs as well as of taking up vegetable cultivation. I adopt parametric and nonparametric approaches to estimate the treatment effects precisely. The estimates indicate no significant difference in agricultural net income between FPC participants and non-participants in either the parametric or the nonparametric estimation. In contrast, the comparison between vegetable and grain farmers using propensity score matching (PSM) reveals that the treatment effect of vegetable cultivation on total and agricultural incomes is significantly positive, although vegetable cultivation requires more labor-intensive effort. These results indicate that it is the take-up of vegetable cultivation, rather than participation in an FPC, that enhances the economic welfare of farmers, owing to the non-excludability of FPCs' services as well as the risks involved in vegetable cultivation.
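A minimal sketch of the PSM comparison, with invented covariates and simulated data: a propensity score for vegetable cultivation is estimated by logistic regression, each vegetable farmer is matched to the nearest grain farmer on that score, and the average income difference estimates the treatment effect on the treated (ATT).

```python
# Toy propensity score matching (PSM) sketch. Covariates, treatment
# assignment, and incomes are simulated, not the study's data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 400
X = rng.normal(size=(n, 2))            # e.g. land size, household labor
treat = rng.random(n) < 1 / (1 + np.exp(-(X @ [0.8, 0.5])))  # vegetables
income = 10 + X @ [1.0, 0.5] + 2.0 * treat + rng.normal(0, 1, n)

ps = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]
t_idx, c_idx = np.where(treat)[0], np.where(~treat)[0]
# Nearest-neighbor match on the propensity score (with replacement)
matches = c_idx[np.abs(ps[c_idx][None, :] - ps[t_idx][:, None]).argmin(axis=1)]
att = (income[t_idx] - income[matches]).mean()
print(f"ATT of vegetable cultivation on income: {att:.2f}")
```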
Abstract:
This thesis proposes how to apply Semantic Web technologies in Idea Management Systems to deliver a solution to their knowledge management and information overflow problems. Firstly, the aim is to present a model that introduces rich metadata annotations and their usage in the domain of Idea Management Systems. Furthermore, the thesis investigates how to link innovation data with information from other systems and use it to categorize and filter out the most valuable elements. In addition, the thesis presents a Generic Idea and Innovation Management Ontology (Gi2MO) and aims to back its creation with a set of case studies followed by evaluations that show how the Semantic Web can work as a tool to create new opportunities and take contemporary Idea Management legacy systems to the next level.
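As a toy illustration of such rich metadata annotations, an idea can be described in RDF. The Gi2MO namespace URI and property names below are assumptions made for the example, not necessarily the published vocabulary:

```python
# Hypothetical RDF annotation of an idea from an Idea Management System.
# The Gi2MO namespace URI and the properties used are assumed for
# illustration only.
from rdflib import Graph, Literal, Namespace, RDF

GI2MO = Namespace("http://purl.org/gi2mo/ns#")  # assumed namespace
g = Graph()
g.bind("gi2mo", GI2MO)

idea = GI2MO["idea/42"]
g.add((idea, RDF.type, GI2MO.Idea))
g.add((idea, GI2MO.content, Literal("Add offline mode to the mobile app")))
g.add((idea, GI2MO.hasStatus, GI2MO["status/underReview"]))
print(g.serialize(format="turtle"))
```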
Abstract:
OntoTag - A Linguistic and Ontological Annotation Model Suitable for the Semantic Web
1. INTRODUCTION. LINGUISTIC TOOLS AND ANNOTATIONS: THEIR LIGHTS AND SHADOWS
Computational Linguistics is already a consolidated research area. It builds upon the results of two other major areas, namely Linguistics and Computer Science and Engineering, and it aims at developing computational models of human language (or natural language, as it is termed in this area). Possibly, its best-known applications are the different tools developed so far for processing human language, such as machine translation systems and speech recognizers or dictation programs.
These tools for processing human language are commonly referred to as linguistic tools. Apart from the examples mentioned above, there are also other types of linguistic tools that perhaps are not so well-known, but on which most of the other applications of Computational Linguistics are built. These other types of linguistic tools comprise POS taggers, natural language parsers and semantic taggers, amongst others. All of them can be termed linguistic annotation tools.
Linguistic annotation tools are important assets. In fact, POS and semantic taggers (and, to a lesser extent, also natural language parsers) have become critical resources for the computer applications that process natural language. Hence, any computer application that has to analyse a text automatically and ‘intelligently’ will include at least a module for POS tagging. The more an application needs to ‘understand’ the meaning of the text it processes, the more linguistic tools and/or modules it will incorporate and integrate.
However, linguistic annotation tools still have some limitations, which can be summarised as follows:
1. Normally, they perform annotations only at a certain linguistic level (that is, Morphology, Syntax, Semantics, etc.).
2. They usually introduce a certain rate of errors and ambiguities when tagging. This error rate ranges from 10 percent up to 50 percent of the units annotated for unrestricted, general texts.
3. Their annotations are most frequently formulated in terms of an annotation schema designed and implemented ad hoc.
A priori, it seems that the interoperation and the integration of several linguistic tools into an appropriate software architecture could most likely solve the limitations stated in (1). Besides, integrating several linguistic annotation tools and making them interoperate could also minimise the limitation stated in (2). Nevertheless, in the latter case, all these tools should produce annotations for a common level, which would have to be combined in order to correct their corresponding errors and inaccuracies. Yet, the limitation stated in (3) prevents both types of integration and interoperation from being easily achieved.
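As a toy illustration of combining annotations produced for a common level, a simple majority vote over the outputs of several POS taggers can correct some of their individual errors (the tagger outputs below are invented):

```python
# Toy combination of token-level POS tags from several taggers at a common
# annotation level: a majority vote corrects some individual errors.
from collections import Counter

tagger_outputs = [
    ["DET", "NOUN", "VERB", "ADP", "NOUN"],   # tagger A
    ["DET", "NOUN", "NOUN", "ADP", "NOUN"],   # tagger B (one error)
    ["DET", "NOUN", "VERB", "ADP", "NOUN"],   # tagger C
]

combined = [Counter(tags).most_common(1)[0][0]
            for tags in zip(*tagger_outputs)]
print(combined)  # ['DET', 'NOUN', 'VERB', 'ADP', 'NOUN']
```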
In addition, most high-level annotation tools rely on other, lower-level annotation tools and their outputs to generate their own. For example, sense-tagging tools (operating at the semantic level) often use POS taggers (operating at a lower level, i.e., the morphosyntactic one) to identify the grammatical category of the word or lexical unit they are annotating. Accordingly, if a faulty or inaccurate low-level annotation tool is to be used by a higher-level one in its process, the errors and inaccuracies of the former should be minimised in advance; otherwise, these errors and inaccuracies will be transferred to (and even magnified in) the annotations of the high-level annotation tool.
Therefore, it would be quite useful to find a way to
(i) correct or, at least, reduce the errors and the inaccuracies of lower-level linguistic tools;
(ii) unify the annotation schemas of different linguistic annotation tools or, more generally speaking, make these tools (as well as their annotations) interoperate.
Clearly, solving (i) and (ii) should ease the automatic annotation of web pages by means of linguistic tools, and their transformation into Semantic Web pages (Berners-Lee, Hendler and Lassila, 2001). Yet, as stated above, (ii) is a type of interoperability problem. Ontologies (Gruber, 1993; Borst, 1997) have been applied successfully to solve several interoperability problems; hence, they should also help solve the problems and limitations of linguistic annotation tools mentioned above.
Thus, to summarise, the main aim of the present work was to combine these hitherto separate approaches, mechanisms and tools for annotation from Linguistics and Ontological Engineering (and the Semantic Web) into a sort of hybrid (linguistic and ontological) annotation model, suitable for both areas. This hybrid (semantic) annotation model should (a) benefit from the advances, models, techniques, mechanisms and tools of these two areas; (b) minimise (and, when possible, even solve) some of the problems found in each of them; and (c) be suitable for the Semantic Web. The concrete goals that helped attain this aim are presented in the following section.
2. GOALS OF THE PRESENT WORK
As mentioned above, the main goal of this work was to specify a hybrid (that is, linguistically-motivated and ontology-based) model of annotation suitable for the Semantic Web (i.e. it had to produce a semantic annotation of web page contents). This entailed that the tags included in the annotations of the model had to (1) represent linguistic concepts (or linguistic categories, as they are termed in ISO/DCR (2008)), in order for this model to be linguistically-motivated; (2) be ontological terms (i.e., use an ontological vocabulary), in order for the model to be ontology-based; and (3) be structured (linked) as a collection of ontology-based annotations.
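As a toy illustration of what such ontology-based linguistic tags could look like (the namespaces and terms below are invented for the example, not OntoTag's actual vocabulary), a token annotated with an ontological part-of-speech term might be expressed in RDF as:

```python
# Hypothetical ontology-based linguistic annotation: the POS tag is a term
# from a linguistic ontology rather than an ad hoc code. Namespace URIs
# and terms are illustrative only.
from rdflib import Graph, Literal, Namespace, RDF

LING = Namespace("http://example.org/linguistic-ontology#")  # assumed
DOC = Namespace("http://example.org/page#")                  # assumed
g = Graph()

token = DOC["token/1"]
g.add((token, RDF.type, LING.Token))
g.add((token, LING.surfaceForm, Literal("annotations")))
g.add((token, LING.hasPartOfSpeech, LING.CommonNoun))  # ontological term
print(g.serialize(format="turtle"))
```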
Abstract:
Visualization of program executions has been used in applications which include education and debugging. However, traditional visualization techniques often fall short of expectations or are altogether inadequate for new programming paradigms, such as Constraint Logic Programming (CLP), whose declarative and operational semantics differ in some crucial ways from those of other paradigms. In particular, traditional ideas regarding the behavior of data often cannot be lifted in a straightforward way to (C)LP from other families of programming languages. In this chapter we discuss techniques for visualizing data evolution in CLP. We briefly review some previously proposed visualization paradigms, and also propose a number of (to our knowledge) novel ones. The graphical representations have been chosen based on the perceived needs of a programmer trying to analyze the behavior and characteristics of an execution. In particular, we concentrate on the representation of the run-time values of the variables, and the constraints among them. Given our interest in visualizing large executions, we also pay attention to abstraction techniques, i.e., techniques which are intended to help in reducing the complexity of the visual information.
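One of the simplest representations of this kind, the run-time narrowing of a finite-domain variable across constraint propagation steps, can be sketched textually (the trace below is invented):

```python
# Toy textual view of a finite-domain variable's run-time value evolving
# during constraint propagation: each step shows the shrinking domain of X.
domain_trace = [
    (1, 10),  # initial domain of X
    (3, 10),  # after posting X > 2
    (3, 7),   # after posting X + Y <= 10 with Y >= 3
    (5, 5),   # after labeling
]

for step, (lo, hi) in enumerate(domain_trace):
    bar = "".join("#" if lo <= v <= hi else "." for v in range(1, 11))
    print(f"step {step}: [{bar}]  X in {lo}..{hi}")
```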
Abstract:
Idea Management Systems are web applications that implement the notion of open innovation through crowdsourcing. Typically, organizations use these kinds of systems to connect to large communities in order to gather ideas for the improvement of products or services. Originating from simple suggestion boxes, Idea Management Systems have advanced beyond collecting ideas and aspire to be knowledge management solutions capable of selecting the best ideas via collaborative as well as expert assessment methods. In practice, however, contemporary systems still face a number of problems, usually related to information overflow and to recognizing submissions of questionable quality with a reasonable allocation of time and effort. This thesis focuses on the problem area of idea assessment and contributes a number of solutions that make it possible to filter, compare and evaluate ideas submitted into an Idea Management System. With respect to Idea Management System interoperability, the thesis proposes a theoretical model of the Idea Life Cycle and formalizes it as the Gi2MO ontology, which makes it possible to go beyond the boundaries of a single system and to compare and assess innovation in an organization-wide or market-wide context. Furthermore, based on the ontology, the thesis builds a number of solutions for improving idea assessment via community opinion analysis (MARL), annotation of idea characteristics (Gi2MO Types) and the study of idea relationships (Gi2MO Links). The main achievements of the thesis are: the application of theoretical innovation models to the practice of Idea Management, successfully recognizing the differentiation between communities; opinion metrics and their recognition as a new tool for idea assessment; and the discovery of new types of relationships between ideas and their impact on idea clustering. Finally, an outcome of the thesis is the establishment of the Gi2MO Project, which serves as an incubator for Idea Management solutions and for mature open-source alternatives to the widely available commercial suites. From the academic point of view, the project delivers resources for undertaking experiments in the Idea Management Systems area and has become a forum that has gathered a number of academic and industrial partners.
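The community opinion analysis mentioned here lends itself to a compact sketch. The following toy example, in the spirit of MARL but not reproducing its actual scheme, scores each comment on an idea with a sentiment value in [-1, 1] and ranks ideas by mean sentiment (all data invented):

```python
# Toy community-opinion metric for idea assessment: rank ideas by the mean
# sentiment of their comments. Sentiment scores are invented; the actual
# MARL scheme is not reproduced here.
from statistics import mean

comment_sentiment = {
    "idea/1": [0.8, 0.6, 0.9],    # mostly positive community opinion
    "idea/2": [-0.4, 0.1, -0.7],  # mostly negative
    "idea/3": [0.2, -0.1],        # mixed
}

ranked = sorted(comment_sentiment.items(),
                key=lambda kv: mean(kv[1]), reverse=True)
for idea, scores in ranked:
    print(f"{idea}: mean sentiment {mean(scores):+.2f} "
          f"over {len(scores)} comments")
```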