966 results for Data quality-aware mechanisms


Relevance: 100.00%

Abstract:

Dissertation presented to obtain a Doctoral Degree in Biology from the Instituto de Tecnologia Química e Biológica, Universidade Nova de Lisboa.

Relevance: 100.00%

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance: 100.00%

Abstract:

The MAP-i Doctoral Programme in Informatics of the Universities of Minho, Aveiro and Porto.

Relevance: 100.00%

Abstract:

Sleep spindles are synchronized 11-15 Hz electroencephalographic (EEG) oscillations predominant during non-rapid-eye-movement sleep (NREMS). Rhythmic bursting in the reticular thalamic nucleus (nRt), arising from the interplay between CaV3.3-type Ca2+ channels and Ca2+-dependent small-conductance type-2 (SK2) K+ channels, underlies spindle generation. Correlative evidence indicates that spindles contribute to memory consolidation and protection against environmental noise in human NREMS. Here, we describe a molecular mechanism through which spindle power is selectively extended, and we probed the actions of intensified spindling in the naturally sleeping mouse. Using electrophysiological recordings in acute brain slices from SK2 channel-overexpressing (SK2-OE) mice, we found that nRt bursting was potentiated and thalamic circuit oscillations were prolonged. Moreover, nRt cells showed greater resilience to transit from burst to tonic discharge in response to gradual depolarization, mimicking transitions out of NREMS. Compared with wild-type littermates, chronic EEG recordings of SK2-OE mice contained less fragmented NREMS, while the NREMS EEG power spectrum was conserved. Furthermore, EEG spindle activity was prolonged at NREMS exit. Finally, when exposed to white noise, SK2-OE mice needed stronger stimuli to arouse. Increased nRt bursting thus strengthens spindles and improves sleep quality through mechanisms independent of EEG slow waves (<4 Hz), suggesting SK2 signaling as a new potential therapeutic target for sleep disorders and for neuropsychiatric diseases accompanied by weakened sleep spindles.
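
For readers unfamiliar with the band definitions used above, the sketch below (synthetic signal and assumed sampling rate, not the study's analysis pipeline) shows how spindle-band (11-15 Hz) and slow-wave (<4 Hz) EEG power can be quantified with Welch's power spectral density estimate.

```python
# Hedged sketch: band-power computation on a synthetic EEG-like signal.
import numpy as np
from scipy.signal import welch

fs = 200.0                                         # sampling rate in Hz (assumed)
t = np.arange(0, 30, 1 / fs)                       # 30 s of synthetic NREMS-like EEG
eeg = (np.sin(2 * np.pi * 1.5 * t)                 # slow-wave component (~1.5 Hz)
       + 0.3 * np.sin(2 * np.pi * 13.0 * t)        # spindle-band component (~13 Hz)
       + 0.2 * np.random.default_rng(0).normal(size=t.size))

freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))

def band_power(freqs, psd, lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return np.trapz(psd[mask], freqs[mask])         # integrate the PSD over the band

print("slow-wave power (<4 Hz):", band_power(freqs, psd, 0.5, 4.0))
print("spindle power (11-15 Hz):", band_power(freqs, psd, 11.0, 15.0))
```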

Relevance: 100.00%

Abstract:

1. Species distribution modelling is used increasingly in both applied and theoretical research to predict how species are distributed and to understand attributes of species' environmental requirements. In species distribution modelling, various statistical methods are used that combine species occurrence data with environmental spatial data layers to predict the suitability of any site for that species. While the number of data sharing initiatives involving species' occurrences in the scientific community has increased dramatically over the past few years, various data quality and methodological concerns related to using these data for species distribution modelling have not been addressed adequately. 2. We evaluated how uncertainty in georeferences and associated locational error in occurrences influence species distribution modelling using two treatments: (1) a control treatment where models were calibrated with original, accurate data and (2) an error treatment where data were first degraded spatially to simulate locational error. To incorporate error into the coordinates, we moved each coordinate with a random number drawn from the normal distribution with a mean of zero and a standard deviation of 5 km. We evaluated the influence of error on the performance of 10 commonly used distributional modelling techniques applied to 40 species in four distinct geographical regions. 3. Locational error in occurrences reduced model performance in three of these regions; relatively accurate predictions of species distributions were possible for most species, even with degraded occurrences. Two species distribution modelling techniques, boosted regression trees and maximum entropy, were the best performing models in the face of locational errors. The results obtained with boosted regression trees were only slightly degraded by errors in location, and the results obtained with the maximum entropy approach were not affected by such errors. 4. Synthesis and applications. To use the vast array of occurrence data that exists currently for research and management relating to the geographical ranges of species, modellers need to know the influence of locational error on model quality and whether some modelling techniques are particularly robust to error. We show that certain modelling techniques are particularly robust to a moderate level of locational error and that useful predictions of species distributions can be made even when occurrence data include some error.
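
The error treatment described in point 2 amounts to adding isotropic Gaussian noise to each occurrence location. Below is a minimal sketch, assuming projected coordinates in kilometres and NumPy; the array names are hypothetical and this is not the authors' code.

```python
# Minimal sketch of the error treatment: shift each occurrence coordinate by
# Gaussian noise with mean 0 and standard deviation 5 km before model calibration.
import numpy as np

def degrade_occurrences(coords_km, sd_km=5.0, seed=0):
    """Add isotropic Gaussian locational error to occurrence coordinates.

    coords_km : (n, 2) array of projected x/y coordinates in kilometres.
    Returns a new array with each coordinate perturbed by N(0, sd_km).
    """
    rng = np.random.default_rng(seed)
    noise = rng.normal(loc=0.0, scale=sd_km, size=coords_km.shape)
    return coords_km + noise

# Example: degrade 40 hypothetical occurrence points inside a 100 x 100 km region.
occurrences = np.random.default_rng(1).uniform(0, 100, size=(40, 2))
degraded = degrade_occurrences(occurrences)
```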

Relevance: 100.00%

Abstract:

Neurocritical care depends, in part, on careful patient monitoring but as yet there are few data on which processes are the most important to monitor, how these should be monitored, and whether monitoring these processes is cost-effective and impacts outcome. At the same time, bioinformatics is a rapidly emerging field in critical care but as yet there is little agreement or standardization on what information is important and how it should be displayed and analyzed. The Neurocritical Care Society in collaboration with the European Society of Intensive Care Medicine, the Society for Critical Care Medicine, and the Latin America Brain Injury Consortium organized an international, multidisciplinary consensus conference to begin to address these needs. International experts from neurosurgery, neurocritical care, neurology, critical care, neuroanesthesiology, nursing, pharmacy, and informatics were recruited on the basis of their research, publication record, and expertise. They undertook a systematic literature review to develop recommendations about specific topics on physiologic processes important to the care of patients with disorders that require neurocritical care. This review does not make recommendations about treatment, imaging, and intraoperative monitoring. A multidisciplinary jury, selected for their expertise in clinical investigation and development of practice guidelines, guided this process. The GRADE system was used to develop recommendations based on literature review, discussion, integrating the literature with the participants' collective experience, and critical review by an impartial jury. Emphasis was placed on the principle that recommendations should be based on both data quality and on trade-offs and translation into clinical practice. Strong consideration was given to providing pragmatic guidance and recommendations for bedside neuromonitoring, even in the absence of high quality data.

Relevance: 100.00%

Abstract:

The remit of the Institute of Public Health in Ireland (IPH) is to promote cooperation for public health between Northern Ireland and the Republic of Ireland in the areas of research and information, capacity building and policy advice. Our approach is to support the Departments of Health and their agencies in both jurisdictions, and to maximise the benefits of all-island cooperation to achieve practical benefits for people in Northern Ireland and the Republic of Ireland. IPH has previously responded to consultations on the Department of Health's Discussion Paper on the Proposed Health Information Bill (June 2008), the Health Information and Quality Authority's Corporate Plan (Oct 2007), and the Road Safety Authority of Ireland's Road Safety Strategy (Jul 2012). IPH supports the development of a national standard demographic dataset for use within the health and social care services. Provided the necessary safeguards are put in place (such as ethics and data protection) and the purpose of collecting the information is fully explained to subjects, mandatory provision of a minimum demographic dataset is usually the best way to achieve the necessary coverage and data quality. Demographic information is needed in several forms to support the public health function:

- Detailed aggregated information for comparison with population counts, in order to assess equity of access to healthcare and to examine population patterns and trends in morbidity and mortality.
- Accurate demographic information for the surveillance of infectious disease outbreaks, monitoring vaccination programmes, and setting priorities for public health interventions.
- Information linked to other data outside of health and social care, such as population data, survey data and longitudinal studies, for research and analysis purposes.
- Information to identify and address public health issues, tackle health inequalities, and monitor the success of efforts to tackle them.

Relevance: 100.00%

Abstract:

The EHLASS survey was set up in April 1986 as a five-year demonstration project. The objective was to monitor home and leisure accidents in a harmonised manner throughout the EU, to determine their causes, the circumstances of their occurrence and their consequences, and, most importantly, to provide information on the consumer products involved. Armed with accurate information, it was felt that consumer policy could be directed at the most serious problems and the best use could be made of available resources. Data collection systems were set up for the collection of EHLASS data in the casualty departments of selected hospitals in each of the member states. The information was subsequently gathered together by the European Commission in Brussels. Extensive analysis was undertaken on 778,838 accidents reported throughout the EU. Centralised analysis of EHLASS data proved problematic due to a lack of co-ordination in data quality. In 1989 it was decided that each member state should produce its own annual EHLASS report in a harmonised format specified by the European Commission. This report is the ninth such report for Ireland.

Relevance: 100.00%

Abstract:

The report provides analysis of PCT participation levels and investigates data quality issues in the collection of the 2007/08 NCMP dataset.

Relevance: 100.00%

Abstract:

The final degree project (PFC) aims to design and implement a cryptographic scheme that guarantees secure access to data, providing mechanisms to safeguard the confidentiality, authenticity and integrity of the data, as well as non-repudiation of the actions carried out by users.
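
The abstract does not specify which primitives the scheme uses; the sketch below is a hypothetical illustration, assuming the Python `cryptography` package, of how the four stated goals are commonly mapped onto standard building blocks: symmetric encryption for confidentiality, and per-user digital signatures for integrity, authenticity and non-repudiation of user actions.

```python
# Hypothetical illustration only -- the project's actual scheme is not described
# in the abstract. Requires the third-party 'cryptography' package.
from cryptography.fernet import Fernet                        # symmetric encryption (confidentiality)
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Per-user signing key: signatures provide integrity, authenticity and
# non-repudiation, since only the key holder can have produced them.
user_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
user_public_key = user_private_key.public_key()

data = b"record 42 updated by user alice"                     # hypothetical protected data

# Sign the data to bind the action to the user.
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)
signature = user_private_key.sign(data, pss, hashes.SHA256())

# Encrypt the data at rest for confidentiality.
storage_key = Fernet.generate_key()
token = Fernet(storage_key).encrypt(data)

# On an authorised access: decrypt, then verify integrity and authenticity.
plaintext = Fernet(storage_key).decrypt(token)
user_public_key.verify(signature, plaintext, pss, hashes.SHA256())   # raises InvalidSignature on tampering
```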

Relevance: 100.00%

Abstract:

Astrocytes have recently become a major center of interest in neurochemistry following the discoveries of their major role in brain energy metabolism. An interesting way to probe this glial contribution is in vivo ¹³C NMR spectroscopy coupled with the infusion of a labeled glia-specific substrate such as acetate. In this study, we infused alpha-chloralose-anesthetized rats with [2-¹³C]acetate and followed the dynamics of the fractional enrichment (FE) in the C4 and C3 positions of glutamate and glutamine with high sensitivity, using ¹H-[¹³C] magnetic resonance spectroscopy (MRS) at 14.1 T. Applying a two-compartment mathematical model to the measured time courses yielded a glial tricarboxylic acid (TCA) cycle rate (Vg) of 0.27 ± 0.02 μmol/g/min and a glutamatergic neurotransmission rate (VNT) of 0.15 ± 0.01 μmol/g/min. Glial oxidative ATP metabolism thus accounts for 38% of total oxidative metabolism measured by NMR. The pyruvate carboxylase rate (VPC) was 0.09 ± 0.01 μmol/g/min, corresponding to 37% of the glial glutamine synthesis rate. The glial and neuronal transmitochondrial fluxes (Vx(g) and Vx(n)) were of the same order of magnitude as the respective TCA cycle fluxes. In addition, we estimated a glial glutamate pool size of 0.6 ± 0.1 μmol/g. The effect of spectral data quality on the flux estimates was analyzed by Monte Carlo simulations. In this ¹³C-acetate labeling study, we propose a refined two-compartment analysis of brain energy metabolism based on ¹³C turnover curves of acetate, glutamate and glutamine measured with state-of-the-art in vivo dynamic MRS at high magnetic field in rats, enabling a deeper understanding of the specific role of glial cells in brain oxidative metabolism. In addition, the robustness of the metabolic flux determination with respect to MRS data quality was carefully studied.
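
To illustrate the Monte Carlo idea mentioned above (not the paper's two-compartment model), the toy sketch below refits a simplified labelling curve to synthetic enrichment data with added noise and reports the resulting spread of the fitted rate; the noise level and rate constant are assumed, purely illustrative values.

```python
# Schematic sketch: Monte Carlo assessment of how spectral noise propagates
# into a fitted rate, using a single-exponential stand-in for the real model.
import numpy as np
from scipy.optimize import curve_fit

def enrichment(t, k):
    """Toy fractional-enrichment curve FE(t) = 0.5 * (1 - exp(-k * t))."""
    return 0.5 * (1.0 - np.exp(-k * t))

t = np.linspace(0, 120, 60)          # minutes of infusion (synthetic time grid)
true_k = 0.05                        # assumed, purely illustrative rate constant (1/min)
clean = enrichment(t, true_k)

rng = np.random.default_rng(0)
fitted = []
for _ in range(1000):                # Monte Carlo replicates
    noisy = clean + rng.normal(0.0, 0.02, size=t.size)   # assumed spectral noise level
    popt, _ = curve_fit(enrichment, t, noisy, p0=[0.1])
    fitted.append(popt[0])

print(f"fitted k = {np.mean(fitted):.3f} +/- {np.std(fitted):.3f} (1/min)")
```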

Relevance: 100.00%

Abstract:

The simultaneous recording of scalp electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) can provide unique insights into the dynamics of human brain function, and the increased functional sensitivity offered by ultra-high-field fMRI opens exciting perspectives for the future of this multimodal approach. However, simultaneous recordings are susceptible to various types of artifacts, many of which scale with magnetic field strength and can seriously compromise both EEG and fMRI data quality in recordings above 3 T. The aim of the present study was to implement and characterize an optimized setup for simultaneous EEG-fMRI in humans at 7 T. The effects of EEG cable length and geometry for signal transmission between the cap and amplifiers were assessed in a phantom model, with specific attention to noise contributions from the MR scanner cold heads. Cable shortening (down to 12 cm from cap to amplifiers) and bundling effectively reduced environment noise by up to 84% in average power and 91% in inter-channel power variability. Subject safety was assessed and confirmed via numerical simulations of RF power distribution and temperature measurements on a phantom model, building on the limited existing literature at ultra-high field. MRI data degradation effects due to the EEG system were characterized via B0 and B1+ field mapping on a human volunteer, demonstrating important, although not prohibitive, B1 disruption effects. With the optimized setup, simultaneous EEG-fMRI acquisitions were performed on 5 healthy volunteers undergoing two visual paradigms: an eyes-open/eyes-closed task and a visual evoked potential (VEP) paradigm using reversing-checkerboard stimulation. EEG data exhibited clear occipital alpha modulation and average VEPs, respectively, with concomitant BOLD signal changes. On a single-trial level, alpha power variations could be observed with relative confidence on all trials; VEP detection was more limited, although statistically significant responses could be detected in more than 50% of trials for every subject. Overall, we conclude that the proposed setup is well suited for simultaneous EEG-fMRI at 7 T.
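
For readers wondering how such noise figures can be obtained, the sketch below (synthetic data and assumed metrics, not the authors' analysis code) computes the average channel power and the inter-channel power variability of two recordings and reports their percentage reduction.

```python
# Hedged sketch: noise metrics compared before/after a setup change.
import numpy as np

def channel_power(eeg):
    """eeg: (n_channels, n_samples) array; mean squared amplitude per channel."""
    return np.mean(eeg ** 2, axis=1)

def noise_metrics(eeg):
    p = channel_power(eeg)
    return p.mean(), p.std()             # average power, inter-channel power variability

rng = np.random.default_rng(0)
long_cables = rng.normal(0, 10.0, size=(32, 5000))    # synthetic noisy recording
short_cables = rng.normal(0, 4.0, size=(32, 5000))    # synthetic quieter recording

p_long, v_long = noise_metrics(long_cables)
p_short, v_short = noise_metrics(short_cables)
print(f"average power reduced by {100 * (1 - p_short / p_long):.0f}%")
print(f"inter-channel variability reduced by {100 * (1 - v_short / v_long):.0f}%")
```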

Relevance: 100.00%

Abstract:

PRINCIPLES: International guidelines for heart failure (HF) care recommend the implementation of inter-professional disease management programmes. To date, no such programme has been tested in Switzerland. The aim of this randomised controlled trial (RCT) was to test the effect on hospitalisation, mortality and quality of life of an adult ambulatory disease management programme for patients with HF in Switzerland.
METHODS: Consecutive patients admitted to internal medicine in a Swiss university hospital were screened for decompensated HF. A total of 42 eligible patients were randomised to an intervention (n = 22) or usual care group (n = 20). Medical treatment was optimised and lifestyle recommendations were given to all patients. Intervention patients additionally received a home visit by an HF nurse, followed by 17 telephone calls of decreasing frequency over 12 months, focusing on self-care. Calls from the HF nurse to primary care physicians communicated health concerns and identified goals of care. Data were collected at baseline and at 3, 6, 9 and 12 months. Mixed regression analysis was used for quality of life. Outcome assessment was conducted by researchers blinded to group assignment.
RESULTS: After 12 months, 22 (52%) patients had an all-cause re-admission or had died. Only 3 patients were hospitalised with HF decompensation. No significant effect of the intervention was found on HF-related quality of life.
CONCLUSIONS: An inter-professional disease management programme is possible in the Swiss healthcare setting, but effects on outcomes need to be confirmed in larger studies.

Relevance: 100.00%

Abstract:

Over thirty years ago, Leamer (1983) - among many others - expressed doubts about the quality and usefulness of empirical analyses for the economic profession by stating that "hardly anyone takes data analyses seriously. Or perhaps more accurately, hardly anyone takes anyone else's data analyses seriously" (p. 37). Improvements in data quality, more robust estimation methods and the evolution of better research designs seem to make that assertion no longer justifiable (see Angrist and Pischke (2010) for a recent response to Leamer's essay). The economic profession and policy makers alike often rely on empirical evidence as a means to investigate policy-relevant questions. The approach of using scientifically rigorous and systematic evidence to identify policies and programs that are capable of improving policy-relevant outcomes is known under the increasingly popular notion of evidence-based policy. Evidence-based economic policy often relies on randomized or quasi-natural experiments in order to identify causal effects of policies. These can require relatively strong assumptions or raise concerns of external validity. In the context of this thesis, potential concerns are, for example, endogeneity of policy reforms with respect to the business cycle in the first chapter, the trade-off between precision and bias in the regression-discontinuity setting in chapter 2, and non-representativeness of the sample due to self-selection in chapter 3. While the identification strategies are very useful to gain insights into the causal effects of specific policy questions, transforming the evidence into concrete policy conclusions can be challenging. Policy development should therefore rely on the systematic evidence of a whole body of research on a specific policy question rather than on a single analysis. In this sense, this thesis cannot and should not be viewed as a comprehensive analysis of specific policy issues but rather as a first step towards a better understanding of certain aspects of a policy question. The thesis applies new and innovative identification strategies to policy-relevant and topical questions in the fields of labor economics and behavioral environmental economics. Each chapter relies on a different identification strategy.

In the first chapter, we employ a difference-in-differences approach to exploit the quasi-experimental change in the entitlement of the maximum unemployment benefit duration to identify the medium-run effects of reduced benefit durations on post-unemployment outcomes. Shortening benefit duration carries a double dividend: it generates fiscal benefits without deteriorating the quality of job matches. On the contrary, shortened benefit durations improve medium-run earnings and employment, possibly through containing the negative effects of skill depreciation or stigmatization.

While the first chapter provides only indirect evidence on the underlying behavioral channels, in the second chapter I develop a novel approach that allows learning about the relative importance of the two key margins of job search - reservation wage choice and search effort. In the framework of a standard non-stationary job search model, I show how the exit rate from unemployment can be decomposed in a way that is informative on reservation wage movements over the unemployment spell. The empirical analysis relies on a sharp discontinuity in unemployment benefit entitlement, which can be exploited in a regression-discontinuity approach to identify the effects of extended benefit durations on unemployment and survivor functions. I find evidence that calls for an important role of reservation wage choices in job search behavior. This can have direct implications for the optimal design of unemployment insurance policies.

The third chapter - while thematically detached from the other chapters - addresses one of the major policy challenges of the 21st century: climate change and resource consumption. Many governments have recently put energy efficiency on top of their agendas. While pricing instruments aimed at regulating energy demand have often been found to be short-lived and difficult to enforce politically, the focus of energy conservation programs has shifted towards behavioral approaches - such as the provision of information or social norm feedback. The third chapter describes a randomized controlled field experiment in which we discuss the effectiveness of different types of feedback on residential electricity consumption. We find that detailed and real-time feedback caused persistent electricity reductions on the order of 3 to 5% of daily electricity consumption. Social norm information can also generate substantial electricity savings when designed appropriately. The findings suggest that behavioral approaches constitute an effective and relatively cheap way of improving residential energy efficiency.
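
As a concrete illustration of the first chapter's identification strategy (synthetic data only, not the thesis dataset), the sketch below estimates a canonical difference-in-differences specification with statsmodels, where the interaction of the treatment and post-reform indicators recovers the effect of the benefit-duration change.

```python
# Hedged sketch of the canonical difference-in-differences specification:
# outcome ~ treated + post + treated:post, with the interaction term
# identifying the policy effect under the parallel-trends assumption.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 4000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),            # hypothetical: affected by the reform
    "post": rng.integers(0, 2, n),               # hypothetical: observed after the reform
})
# Synthetic post-unemployment earnings with a +0.5 treatment effect built in.
df["earnings"] = (
    10 + 1.0 * df["treated"] + 0.8 * df["post"]
    + 0.5 * df["treated"] * df["post"] + rng.normal(0, 1, n)
)

model = smf.ols("earnings ~ treated * post", data=df).fit(cov_type="HC1")
print(model.params["treated:post"])              # difference-in-differences estimate
```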

Relevance: 100.00%

Abstract:

The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must consider a simplified approach that is not highly parameter-dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R, for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at www.flow-r.org), and has been successfully applied to different case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model was also found relevant for assessing other natural hazards such as rockfall, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM and avoids over-channelization, and so produces more realistic extents. The choice of datasets and algorithms is open to the user, which makes the model adaptable to various applications and levels of dataset availability. Among the possible datasets, the DEM is the only one that is truly required for both the source area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results. However, valuable results have still been obtained on the basis of lower-quality DEMs with 25 m resolution.
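
As an illustration of the spreading principle (not the Flow-R implementation, whose modified algorithm is described in the associated publications), the sketch below computes Holmgren-type flow proportions from a single DEM cell to its eight neighbours, with flow weighted by slope raised to an exponent x that controls how channelised the spreading becomes.

```python
# Hedged sketch of a Holmgren-type multiple flow-direction step on a DEM.
import numpy as np

def holmgren_weights(dem, row, col, x=4.0, cell_size=10.0):
    """Return flow proportions from cell (row, col) to its 8 neighbours."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    weights = np.zeros(len(offsets))
    for i, (dr, dc) in enumerate(offsets):
        r, c = row + dr, col + dc
        if not (0 <= r < dem.shape[0] and 0 <= c < dem.shape[1]):
            continue
        distance = cell_size * np.hypot(dr, dc)
        slope = (dem[row, col] - dem[r, c]) / distance      # tan(beta) towards the neighbour
        if slope > 0:                                        # only downslope neighbours receive flow
            weights[i] = slope ** x
    total = weights.sum()
    return weights / total if total > 0 else weights         # proportions sum to 1 (or stay 0 in a pit)

# Example on a tiny synthetic DEM (elevations in metres, 10 m cells).
dem = np.array([[12.0, 11.0, 10.0],
                [11.0,  9.5,  9.0],
                [10.0,  9.0,  8.0]])
print(holmgren_weights(dem, 1, 1))
```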