951 results for Spatial Data Quality
Abstract:
The MAP-i Doctoral Programme in Informatics, of the Universities of Minho, Aveiro and Porto
Abstract:
This study identified and mapped the flood-prone areas of the Luís Alves River Basin. The basin, whose land use is predominantly agricultural, has a notable history of flooding, calling for studies to support territorial planning and management. Based on the collection of rainfall and streamflow data, combined with the processing of spatial data, a GIS-based methodology was developed that enabled the simulation of flood events as well as the delineation of risk areas.
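As an illustration of the simplest form such a GIS flood simulation can take, the sketch below flags DEM cells lying below a simulated water surface. It is a minimal sketch only: the file name, the water level, and the use of rasterio/numpy are assumptions, not details from the study.

```python
import numpy as np
import rasterio  # assumed raster I/O library; the study does not name its tooling

def inundated_mask(dem_path: str, water_level: float) -> np.ndarray:
    """Flag DEM cells lying at or below a simulated flood water level."""
    with rasterio.open(dem_path) as src:
        dem = src.read(1).astype(float)
        nodata = src.nodata
    valid = (dem != nodata) if nodata is not None else np.ones(dem.shape, bool)
    return valid & (dem <= water_level)

# Hypothetical inputs: a basin DEM and a 12.5 m simulated water surface.
mask = inundated_mask("luis_alves_dem.tif", water_level=12.5)
print(f"{mask.sum()} cells flagged as inundated")
```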
Abstract:
The algorithmic approach to data modelling has developed rapidly in recent years; in particular, methods based on data mining and machine learning have been used in a growing number of applications. These methods follow a data-driven methodology, aiming to provide the best possible generalization and predictive ability rather than concentrating on the properties of the data model. One of the most successful groups of such methods is known as Support Vector algorithms. Following the fruitful developments in applying Support Vector algorithms to spatial data, this paper introduces a new extension of the traditional support vector regression (SVR) algorithm. This extension allows the simultaneous modelling of environmental data at several spatial scales. The joint influence of environmental processes presenting different patterns at different scales is learned automatically from data, providing the optimal mixture of short- and large-scale models. The method is adaptive to the spatial scale of the data. With this advantage, it provides an efficient means to model local anomalies that typically arise in the early phase of an environmental emergency. However, the proposed approach still requires some prior knowledge of the possible existence of such short-scale patterns, which may limit its implementation in early warning systems. The purpose of this paper is to present the multi-scale SVR model and to illustrate its use with an application to the mapping of Cs137 activity, given the measurements taken in the region of Briansk following the Chernobyl accident.
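A minimal sketch of the multi-scale idea, assuming a kernel built as a convex mixture of a short-scale and a large-scale RBF component (the length scales, mixing weight and scikit-learn tooling are illustrative, not the paper's actual formulation or values):

```python
import numpy as np
from sklearn.svm import SVR

def multiscale_rbf(X, Y, s_short=0.1, s_large=2.0, w=0.5):
    """Convex mixture of a short-scale and a large-scale RBF kernel."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return (w * np.exp(-d2 / (2 * s_short**2))
            + (1 - w) * np.exp(-d2 / (2 * s_large**2)))

# Synthetic 2-D "measurement" data standing in for spatial observations.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

model = SVR(kernel=multiscale_rbf, C=10.0, epsilon=0.05)
model.fit(X, y)
predictions = model.predict(X)
```

In the paper's formulation the optimal mixture of scales is learned from the data; here the weight w is fixed by hand, so a faithful reproduction would tune s_short, s_large and w, for example by cross-validation.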
Abstract:
This project was motivated by the explosion of Internet applications based on location services, such as mobility web portals and online vehicle-tracking applications. Google Maps makes it easy to add maps to a website with its API, but OpenLayers, a free JavaScript library, offers the option of loading map layers and markers from any source. OpenStreetMap provides free geographic data, such as street and road maps. A careful study of the structure and grouping of data in the OSM format, together with the development of a tile-based map server, are the main starting points for creating our own cartographic data source. This project analyses and processes DXF (Drawing eXchange Format) files, converting them to the OSM format. An OSM file contains the geographic information needed for the spatial database from which, among other applications, our own maps can be displayed in a vehicle-tracking application or on a web portal.
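A minimal sketch of the DXF-to-OSM step (the ezdxf library, the restriction to LWPOLYLINE entities, and the assumption that coordinates are already lon/lat are all illustrative; the project does not specify its implementation):

```python
import xml.etree.ElementTree as ET
import ezdxf  # assumed DXF reader; the project does not name its tooling

def dxf_polylines_to_osm(dxf_path: str, osm_path: str) -> None:
    """Write DXF LWPOLYLINE entities out as OSM nodes and ways.

    Assumes drawing coordinates are already lon/lat; a real converter
    would first reproject from the drawing's coordinate system.
    """
    doc = ezdxf.readfile(dxf_path)
    osm = ET.Element("osm", version="0.6", generator="dxf2osm-sketch")
    node_id = way_id = 0
    for pline in doc.modelspace().query("LWPOLYLINE"):
        way_id -= 1  # negative ids conventionally mark objects not yet in OSM
        way = ET.Element("way", id=str(way_id))
        for point in pline.get_points():
            node_id -= 1
            ET.SubElement(osm, "node", id=str(node_id),
                          lon=str(point[0]), lat=str(point[1]))
            ET.SubElement(way, "nd", ref=str(node_id))
        ET.SubElement(way, "tag", k="highway", v="road")  # placeholder tag
        osm.append(way)
    ET.ElementTree(osm).write(osm_path, encoding="utf-8", xml_declaration=True)
```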
Abstract:
Neurocritical care depends, in part, on careful patient monitoring, but as yet there are few data on which processes are the most important to monitor, how they should be monitored, and whether monitoring them is cost-effective and affects outcome. At the same time, bioinformatics is a rapidly emerging field in critical care, but as yet there is little agreement or standardization on what information is important and how it should be displayed and analyzed. The Neurocritical Care Society, in collaboration with the European Society of Intensive Care Medicine, the Society of Critical Care Medicine, and the Latin America Brain Injury Consortium, organized an international, multidisciplinary consensus conference to begin to address these needs. International experts from neurosurgery, neurocritical care, neurology, critical care, neuroanesthesiology, nursing, pharmacy, and informatics were recruited on the basis of their research, publication record, and expertise. They undertook a systematic literature review to develop recommendations on specific physiologic processes important to the care of patients with disorders that require neurocritical care. The review does not make recommendations about treatment, imaging, or intraoperative monitoring. A multidisciplinary jury, selected for expertise in clinical investigation and the development of practice guidelines, guided the process. The GRADE system was used to develop recommendations based on the literature review, discussion integrating the literature with the participants' collective experience, and critical review by an impartial jury. Emphasis was placed on the principle that recommendations should be based both on data quality and on trade-offs and translation into clinical practice. Strong consideration was given to providing pragmatic guidance and recommendations for bedside neuromonitoring, even in the absence of high-quality data.
Abstract:
The remit of the Institute of Public Health in Ireland (IPH) is to promote cooperation for public health between Northern Ireland and the Republic of Ireland in the areas of research and information, capacity building and policy advice. Our approach is to support the Departments of Health and their agencies in both jurisdictions, and to maximise the benefits of all-island cooperation to achieve practical benefits for people in Northern Ireland and the Republic of Ireland. IPH has previously responded to consultations on the Department of Health's Discussion Paper on the Proposed Health Information Bill (June 2008), the Health Information and Quality Authority's Corporate Plan (Oct 2007), and the Road Safety Authority of Ireland's Road Safety Strategy (Jul 2012). IPH supports the development of a national standard demographic dataset for use within the health and social care services. Provided the necessary safeguards are put in place (such as ethics and data protection) and the purpose of collecting the information is fully explained to subjects, mandatory provision of a minimum demographic dataset is usually the best way to achieve the necessary coverage and data quality. Demographic information is needed in several forms to support the public health function:
- detailed aggregated information for comparison with population counts, in order to assess equity of access to healthcare and to examine population patterns and trends in morbidity and mortality;
- accurate demographic information for the surveillance of infectious disease outbreaks, the monitoring of vaccination programmes, and the setting of priorities for public health interventions;
- information linked to data outside health and social care, such as population data, survey data and longitudinal studies, for research and analysis purposes;
- information to identify and address public health issues, tackle health inequalities, and monitor the success of efforts to tackle them.
Abstract:
The EHLASS (European Home and Leisure Accident Surveillance System) survey was set up in April 1986 as a five-year demonstration project. The objective was to monitor home and leisure accidents in a harmonised manner throughout the EU, to determine their causes, the circumstances of their occurrence, their consequences and, most importantly, to provide information on the consumer products involved. Armed with accurate information, it was felt that consumer policy could be directed at the most serious problems and the best use could be made of available resources. Data collection systems were set up for the collection of EHLASS data in the casualty departments of selected hospitals in each of the member states. The information was subsequently gathered together by the European Commission in Brussels. Extensive analysis was undertaken on 778,838 accidents reported throughout the EU. Centralised analysis of EHLASS data proved problematic due to a lack of coordination in data quality. In 1989 it was decided that each member state should produce its own annual EHLASS report in a harmonised format specified by the European Commission. This report is the ninth such report for Ireland.
Abstract:
The objective is an application for calculating the volume of earth available in the subsoil of a selected area. The final goal of the project is to create a Geographic Information System (GIS) that helps assess which parcels in the selected area hold the greatest volume of earth, so that their exploitation can begin. To this end, the gvSIG software and its extensions (SEXTANTE) are available, along with all the information that can be obtained on GIS, cartography, geodesy, and related topics. Carrying out this project requires experience with databases and object-oriented programming, and knowledge of software engineering is recommended. The project focuses on the use of gvSIG as a concrete example of freely available GIS software, a solution developed by the Conselleria d'Obres Públiques de la Generalitat Valenciana. Part of the project consists of evaluating this software. The final result will be the knowledge needed to work with spatial data, together with a GIS application for calculating the volume of earth in a selected area.
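A minimal sketch of the core volume computation such an application performs, summing layer thickness over a parcel mask (the two-surface setup, grid sizes and values are illustrative assumptions, not the project's data):

```python
import numpy as np

def earth_volume(top_dem: np.ndarray, bottom_dem: np.ndarray,
                 parcel_mask: np.ndarray, cell_area: float) -> float:
    """Volume (m³) of earth between two elevation surfaces within a parcel.

    top_dem / bottom_dem: elevation grids (m) of the terrain surface and the
    bottom of the exploitable layer; parcel_mask: boolean grid selecting the
    parcel's cells; cell_area: area of one grid cell (m²).
    """
    depth = np.clip(top_dem - bottom_dem, 0.0, None)  # ignore negative thickness
    return float(depth[parcel_mask].sum() * cell_area)

# Synthetic example on a 5 m grid (cell_area = 25 m²): a 2.5 m-thick layer
# under a 20x20-cell parcel gives 400 * 2.5 * 25 = 25000 m³.
top = np.full((100, 100), 50.0)
bottom = np.full((100, 100), 47.5)
mask = np.zeros((100, 100), dtype=bool)
mask[20:40, 20:40] = True
print(earth_volume(top, bottom, mask, cell_area=25.0))
```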
Abstract:
The report provides an analysis of PCT (primary care trust) participation levels and investigates data quality issues in the collection of the 2007/08 National Child Measurement Programme (NCMP) dataset.
Abstract:
Astrocytes have recently become a major focus of interest in neurochemistry following discoveries on their major role in brain energy metabolism. An interesting way to probe this glial contribution is in vivo ¹³C NMR spectroscopy coupled with the infusion of a labeled glial-specific substrate, such as acetate. In this study, we infused alpha-chloralose-anesthetized rats with [2-¹³C]acetate and followed the dynamics of the fractional enrichment (FE) in the C4 and C3 positions of glutamate and glutamine with high sensitivity, using ¹H-[¹³C] magnetic resonance spectroscopy (MRS) at 14.1 T. Applying a two-compartment mathematical model to the measured time courses yielded a glial tricarboxylic acid (TCA) cycle rate (Vg) of 0.27 ± 0.02 μmol/g/min and a glutamatergic neurotransmission rate (VNT) of 0.15 ± 0.01 μmol/g/min. Glial oxidative ATP metabolism thus accounts for 38% of the total oxidative metabolism measured by NMR. The pyruvate carboxylase rate (VPC) was 0.09 ± 0.01 μmol/g/min, corresponding to 37% of the glial glutamine synthesis rate. The glial and neuronal transmitochondrial fluxes (Vx(g) and Vx(n)) were of the same order of magnitude as the respective TCA cycle fluxes. In addition, we estimated a glial glutamate pool size of 0.6 ± 0.1 μmol/g. The effect of spectral data quality on the flux estimates was analyzed by Monte Carlo simulations. In this ¹³C-acetate labeling study, we propose a refined two-compartment analysis of brain energy metabolism based on ¹³C turnover curves of acetate, glutamate and glutamine measured with state-of-the-art in vivo dynamic MRS at high magnetic field in rats, enabling a deeper understanding of the specific role of glial cells in brain oxidative metabolism. In addition, the robustness of the metabolic flux determination relative to MRS data quality was carefully studied.
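A minimal illustration of the Monte Carlo approach to testing how spectral noise propagates into fitted rates: a toy single-exponential enrichment curve is refitted over many noise draws (the curve, the rate and the noise level are illustrative stand-ins, not the paper's two-compartment model):

```python
import numpy as np
from scipy.optimize import curve_fit

def fe_curve(t, v):
    """Toy fractional-enrichment turnover curve with rate v (not the
    two-compartment model of the paper)."""
    return 0.5 * (1.0 - np.exp(-v * t))

rng = np.random.default_rng(1)
t = np.linspace(0, 120, 60)  # acquisition times, min
true_v = 0.27                # illustrative "flux-like" rate, 1/min

estimates = []
for _ in range(1000):        # Monte Carlo over synthetic noise realizations
    noisy = fe_curve(t, true_v) + rng.normal(0.0, 0.02, t.size)
    (v_hat,), _ = curve_fit(fe_curve, t, noisy, p0=[0.1])
    estimates.append(v_hat)

print(f"v = {np.mean(estimates):.3f} ± {np.std(estimates):.3f} (fit spread)")
```

Repeating the fit while varying the injected noise level maps how the precision of the recovered rate degrades with spectral data quality.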
Abstract:
The simultaneous recording of scalp electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) can provide unique insights into the dynamics of human brain function, and the increased functional sensitivity offered by ultra-high-field fMRI opens exciting perspectives for the future of this multimodal approach. However, simultaneous recordings are susceptible to various types of artifacts, many of which scale with magnetic field strength and can seriously compromise both EEG and fMRI data quality in recordings above 3 T. The aim of the present study was to implement and characterize an optimized setup for simultaneous EEG-fMRI in humans at 7 T. The effects of EEG cable length and geometry on signal transmission between the cap and the amplifiers were assessed in a phantom model, with specific attention to noise contributions from the MR scanner coldheads. Cable shortening (down to 12 cm from cap to amplifiers) and bundling effectively reduced environment noise by up to 84% in average power and 91% in inter-channel power variability. Subject safety was assessed and confirmed via numerical simulations of RF power distribution and temperature measurements on a phantom model, building on the limited existing literature at ultra-high field. MRI data degradation effects due to the EEG system were characterized via B0 and B1+ field mapping on a human volunteer, demonstrating substantial, although not prohibitive, B1 disruption effects. With the optimized setup, simultaneous EEG-fMRI acquisitions were performed on 5 healthy volunteers undergoing two visual paradigms: an eyes-open/eyes-closed task and a visual evoked potential (VEP) paradigm using reversing-checkerboard stimulation. EEG data exhibited clear occipital alpha modulation and average VEPs, respectively, with concomitant BOLD signal changes. At the single-trial level, alpha power variations could be observed with relative confidence on all trials; VEP detection was more limited, although statistically significant responses could be detected in more than 50% of trials for every subject. Overall, we conclude that the proposed setup is well suited for simultaneous EEG-fMRI at 7 T.
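A small sketch of how such noise-reduction figures can be computed from per-channel noise power estimates (the channel count and the synthetic power values are assumptions for illustration only, not the study's measurements):

```python
import numpy as np

# Synthetic per-channel environment-noise power (e.g. band-integrated PSD)
# for the long-cable and shortened/bundled-cable configurations.
rng = np.random.default_rng(3)
power_long = rng.gamma(4.0, 2.0, size=64)                     # 64 channels
power_short = power_long * 0.16 * rng.uniform(0.8, 1.2, 64)   # much quieter

mean_reduction = 1.0 - power_short.mean() / power_long.mean()
variability_reduction = 1.0 - power_short.std() / power_long.std()
print(f"average power reduced by {mean_reduction:.0%}, "
      f"inter-channel variability by {variability_reduction:.0%}")
```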
Abstract:
PRINCIPLES: International guidelines for heart failure (HF) care recommend the implementation of inter-professional disease management programmes. To date, no such programme has been tested in Switzerland. The aim of this randomised controlled trial (RCT) was to test the effect on hospitalisation, mortality and quality of life of an adult ambulatory disease management programme for patients with HF in Switzerland. METHODS: Consecutive patients admitted to internal medicine in a Swiss university hospital were screened for decompensated HF. A total of 42 eligible patients were randomised to an intervention (n = 22) or usual care (n = 20) group. Medical treatment was optimised and lifestyle recommendations were given to all patients. Intervention patients additionally received a home visit by an HF nurse, followed by 17 telephone calls of decreasing frequency over 12 months, focusing on self-care. Calls from the HF nurse to primary care physicians communicated health concerns and identified goals of care. Data were collected at baseline and at 3, 6, 9 and 12 months. Mixed regression analysis was used for quality of life. Outcome assessment was conducted by researchers blinded to group assignment. RESULTS: After 12 months, 22 (52%) patients had an all-cause re-admission or had died. Only 3 patients were hospitalised with HF decompensation. No significant effect of the intervention was found on HF-related quality of life. CONCLUSIONS: An inter-professional disease management programme is feasible in the Swiss healthcare setting, but effects on outcomes need to be confirmed in larger studies.
Abstract:
Over thirty years ago, Leamer (1983), among many others, expressed doubts about the quality and usefulness of empirical analyses for the economic profession by stating that "hardly anyone takes data analyses seriously. Or perhaps more accurately, hardly anyone takes anyone else's data analyses seriously" (p. 37). Improvements in data quality, more robust estimation methods and the evolution of better research designs seem to have made that assertion no longer justifiable (see Angrist and Pischke (2010) for a recent response to Leamer's essay). The economic profession and policy makers alike often rely on empirical evidence as a means to investigate policy-relevant questions. The approach of using scientifically rigorous and systematic evidence to identify policies and programs capable of improving policy-relevant outcomes is known under the increasingly popular notion of evidence-based policy. Evidence-based economic policy often relies on randomized or quasi-natural experiments to identify the causal effects of policies. These can require relatively strong assumptions or raise concerns of external validity. In the context of this thesis, potential concerns are, for example, the endogeneity of policy reforms with respect to the business cycle in the first chapter, the trade-off between precision and bias in the regression-discontinuity setting in chapter 2, and the non-representativeness of the sample due to self-selection in chapter 3. While the identification strategies are very useful for gaining insights into the causal effects of specific policy questions, transforming the evidence into concrete policy conclusions can be challenging. Policy development should therefore rely on the systematic evidence of a whole body of research on a specific policy question rather than on a single analysis. In this sense, this thesis cannot and should not be viewed as a comprehensive analysis of specific policy issues, but rather as a first step towards a better understanding of certain aspects of a policy question. The thesis applies new and innovative identification strategies to policy-relevant and topical questions in the fields of labor economics and behavioral environmental economics. Each chapter relies on a different identification strategy. In the first chapter, we employ a difference-in-differences approach that exploits a quasi-experimental change in the entitlement to the maximum unemployment benefit duration to identify the medium-run effects of reduced benefit durations on post-unemployment outcomes. Shortening benefit duration carries a double dividend: it generates fiscal benefits without deteriorating the quality of job matches. On the contrary, shortened benefit durations improve medium-run earnings and employment, possibly by containing the negative effects of skill depreciation or stigmatization. While the first chapter provides only indirect evidence on the underlying behavioral channels, in the second chapter I develop a novel approach that makes it possible to learn about the relative importance of the two key margins of job search: reservation wage choice and search effort. In the framework of a standard non-stationary job search model, I show how the exit rate from unemployment can be decomposed in a way that is informative about reservation wage movements over the unemployment spell.
The empirical analysis relies on a sharp discontinuity in unemployment benefit entitlement, which can be exploited in a regression-discontinuity approach to identify the effects of extended benefit durations on unemployment and on survivor functions. I find evidence for an important role of reservation wage choices in job search behavior. This has direct implications for the optimal design of unemployment insurance policies. The third chapter, while thematically detached from the other chapters, addresses one of the major policy challenges of the 21st century: climate change and resource consumption. Many governments have recently put energy efficiency at the top of their agendas. While pricing instruments aimed at regulating energy demand have often been found to be short-lived and politically difficult to enforce, the focus of energy conservation programs has shifted towards behavioral approaches, such as the provision of information or social norm feedback. The third chapter describes a randomized controlled field experiment in which we assess the effectiveness of different types of feedback on residential electricity consumption. We find that detailed and real-time feedback caused persistent electricity reductions on the order of 3 to 5% of daily electricity consumption. Social norm information can also generate substantial electricity savings when designed appropriately. The findings suggest that behavioral approaches constitute an effective and relatively cheap way of improving residential energy efficiency.
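A minimal sketch of the difference-in-differences estimator behind the first chapter, run on synthetic data (the variable names, the planted effect size and the statsmodels tooling are illustrative assumptions, not the thesis's data or specification):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic two-group, two-period panel: 'treated' units face the benefit
# reform once 'post' == 1; the +0.05 interaction is the planted true effect.
rng = np.random.default_rng(2)
n = 4000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),
    "post": rng.integers(0, 2, n),
})
df["log_earnings"] = (1.0 + 0.2 * df.treated + 0.1 * df.post
                      + 0.05 * df.treated * df.post
                      + rng.normal(0.0, 0.3, n))

# The DiD estimate is the coefficient on the treated:post interaction.
model = smf.ols("log_earnings ~ treated * post", data=df).fit(cov_type="HC1")
print(model.params["treated:post"], model.bse["treated:post"])
```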
Abstract:
The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must therefore rely on a simplified approach that is not highly parameter-dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R, for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at www.flow-r.org), and has been successfully applied to case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model has also proved relevant for assessing other natural hazards such as rockfall, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM and that avoids over-channelization, and so produces more realistic extents. The choice of datasets and algorithms is open to the user, which makes the model adaptable to various applications and levels of dataset availability. Amongst the possible datasets, the DEM is the only one that is strictly needed for both the source area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results. However, valuable results have still been obtained from lower-quality DEMs with 25 m resolution.
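A minimal sketch of the classic Holmgren (1994) multiple-flow-direction weighting on which Flow-R's improved spreading algorithm builds (Flow-R's actual variant is modified to reduce DEM sensitivity and over-channelization; the exponent x = 4 here is illustrative):

```python
import numpy as np

def holmgren_weights(dem: np.ndarray, row: int, col: int, x: float = 4.0) -> np.ndarray:
    """Flow proportions from one interior cell to its 8 neighbours.

    Classic Holmgren weighting: the proportion to downslope neighbour i is
    tan(beta_i)**x / sum_j tan(beta_j)**x, computed in grid-cell units.
    """
    weights = np.zeros((3, 3))
    z0 = dem[row, col]
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            drop = z0 - dem[row + dr, col + dc]
            if drop > 0:                     # downslope neighbours only
                distance = np.hypot(dr, dc)  # 1 or sqrt(2) cells
                weights[dr + 1, dc + 1] = (drop / distance) ** x
    total = weights.sum()
    return weights / total if total > 0 else weights  # all-zero if the cell is a pit
```

A larger exponent concentrates the flow towards the steepest descent, while x = 1 spreads it more evenly; applying this weighting cell by cell propagates the susceptibility downslope across the DEM.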
Abstract:
This final-year project presents the design principles and prototype implementation of BIMS (Biomedical Information Management System), a flexible software system that provides an infrastructure to manage all information required by biomedical research projects. The BIMS project was initiated with the motivation to solve several limitations in the medical data acquisition of some research projects in which Universitat Pompeu Fabra takes part. These limitations, stemming from the lack of control mechanisms to constrain the information submitted by clinicians, degrade data quality. BIMS can easily be adapted to manage information for a wide variety of clinical studies and is not limited to a given clinical specialty. The software can manage both textual information, such as clinical data (measurements, demographics, diagnostics, etc.), and several kinds of medical images (magnetic resonance imaging, computed tomography, etc.). Moreover, BIMS provides a web-based graphical user interface and is designed to be deployed in a distributed, multi-user environment. It is built on top of open-source software products and frameworks. Specifically, BIMS has been used to represent all clinical data currently used within the CardioLab platform (an ongoing project managed by Universitat Pompeu Fabra), demonstrating that it is a solid software system that could fulfil the requirements of a real production environment.
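A minimal sketch of the submission-time constraint checking that such a system adds to protect data quality (the field names, ranges and structure are illustrative assumptions, not BIMS's actual schema):

```python
from dataclasses import dataclass

@dataclass
class FieldConstraint:
    """Permitted numeric range for one clinical field (illustrative)."""
    name: str
    lo: float
    hi: float

CONSTRAINTS = [
    FieldConstraint("heart_rate_bpm", 20, 250),
    FieldConstraint("systolic_bp_mmhg", 50, 260),
]

def validate(record: dict) -> list[str]:
    """Return the constraint violations in a clinician's submission."""
    errors = []
    for c in CONSTRAINTS:
        value = record.get(c.name)
        if value is None:
            errors.append(f"{c.name}: missing")
        elif not (c.lo <= value <= c.hi):
            errors.append(f"{c.name}: {value} outside [{c.lo}, {c.hi}]")
    return errors

# A submission with an out-of-range heart rate and a missing blood pressure:
print(validate({"heart_rate_bpm": 300}))
```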