938 results for Optimistic data replication system
Abstract:
PURPOSE: To describe the implementation and use of an electronic patient-referral system as an aid to the efficient referral of patients to a remote and specialized treatment center. METHODS AND MATERIALS: A system for the exchange of radiotherapy data between different commercial planning systems and a specially developed planning system for proton therapy has been developed through the use of the PAPYRUS diagnostic image standard as an intermediate format. To ensure the cooperation of the different TPS manufacturers, the number of data sets defined for transfer has been restricted to the three core data sets of CT, VOIs, and three-dimensional dose distributions. As a complement to the exchange of data, network-wide application-sharing (video-conferencing) technologies have been adopted to provide methods for the interactive discussion and assessment of treatment plans with one or more partner clinics. RESULTS: Through the use of evaluation plans based on the exchanged data, referring clinics can accurately assess the advantages offered by proton therapy on a patient-by-patient basis, while the practicality or otherwise of the proposed treatments can simultaneously be assessed by the proton therapy center. Such a system, along with the interactive capabilities provided by video-conferencing methods, has been found to be an efficient solution to the problem of patient assessment and selection at a specialized treatment center, and is a necessary first step toward the full electronic integration of such centers with their remotely situated referral centers.
Abstract:
The primary challenge in groundwater and contaminant transport modeling is obtaining the data needed for constructing, calibrating and testing the models. Large amounts of data are necessary for describing the hydrostratigraphy in areas with complex geology. Increasingly, states are making spatial data available that can be used as input to groundwater flow models. The appropriateness of these data for large-scale flow systems has not been tested. This study focuses on modeling a plume of 1,4-dioxane in a heterogeneous aquifer system in Scio Township, Washtenaw County, Michigan. The analysis consisted of: (1) characterization of the hydrogeology of the area and construction of a conceptual model based on publicly available spatial data, (2) development and calibration of a regional flow model for the site, (3) conversion of the regional model to a more highly resolved local model, (4) simulation of the dioxane plume, and (5) evaluation of the model's ability to simulate field data and estimation of the possible dioxane sources and their subsequent migration until maximum concentrations are at or below the Michigan Department of Environmental Quality's residential cleanup standard for groundwater (85 ppb). The MODFLOW-2000 and MT3D programs were used to simulate the groundwater flow and the development and movement of the 1,4-dioxane plume, respectively. MODFLOW simulates transient groundwater flow in a quasi-3-dimensional sense, subject to a variety of boundary conditions that can represent recharge, pumping, and surface-water/groundwater interactions. MT3D simulates solute advection with groundwater flow (using the flow solution from MODFLOW), dispersion, source/sink mixing, and chemical reaction of contaminants. This modeling approach successfully simulated the groundwater flows by calibrating recharge and hydraulic conductivities. The plume transport was adequately simulated using literature dispersivity and sorption coefficients, although the plume geometries were not well constrained.
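MT3D's core task in step (4) is solving the advection-dispersion equation for the solute. A minimal 1D explicit finite-difference sketch (with hypothetical velocity, dispersion, and grid parameters, not the site model) illustrates the two transport terms:

```python
import numpy as np

# Minimal 1D advection-dispersion sketch (hypothetical parameters).
# MT3D solves the full 3-D problem; this only illustrates the equation
# dC/dt = D * d2C/dx2 - v * dC/dx for a conservative solute.

def advect_disperse(c, v, D, dx, dt, steps):
    """Explicit finite-difference update (upwind advection, v > 0)."""
    c = c.astype(float).copy()
    for _ in range(steps):
        adv = -v * (c[1:-1] - c[:-2]) / dx                  # upwind advection
        disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2   # dispersion
        c[1:-1] += dt * (adv + disp)                        # boundaries held at 0
    return c

c0 = np.zeros(100)
c0[10] = 100.0            # initial slug of solute (arbitrary units)
c = advect_disperse(c0, v=1.0, D=0.5, dx=1.0, dt=0.1, steps=200)
peak = int(np.argmax(c))  # plume centre migrates downgradient with v
```

The time step respects the explicit stability limits (Courant number 0.1, diffusion number 0.05); a real transport model would also handle sorption and source terms, which MT3D does.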
Abstract:
In the current market system, power systems are operated at higher loads for economic reasons. Power system stability becomes a genuine concern under such operating conditions. If a large component fails, the system may become stressed. Such events can trigger cascading failures, which may lead to blackouts. One of the main reasons for the major recorded blackout events has been the unavailability of system-wide information. Synchrophasor technology can provide system-wide real-time information. Phasor Measurement Units (PMUs) are the basic building blocks of this technology; they provide Global Positioning System (GPS) time-stamped voltage and current phasor values along with the frequency. It is assumed that synchrophasor data for all buses are available and thus the whole system is fully observable. This information can be used to initiate islanding, or system separation, to avoid blackouts. A system separation strategy using synchrophasor data has been developed to answer the three main questions of system separation: (1) When to separate: One-class support vector machines (OC-SVM) are primarily an anomaly-detection technique; here, OC-SVM was used to detect wide-area instability. It was tested on different stable and unstable cases and found capable of detecting wide-area instability, and thus of answering the question of when the system should be separated. (2) Where to separate: An agglomerative clustering technique was used to find groups of coherent buses; the lines connecting different groups of coherent buses form the separation surface. The rate of change of the bus voltage phase angles was used as the input to this technique, which can exactly identify the lines to be tripped for system separation. (3) What to do after separation: Load shedding approximately equal to the sum of the power flows along the candidate separation lines should be performed before tripping these lines; it is therefore recommended that load shedding be initiated before the lines are tripped for system separation.
DIMENSION REDUCTION FOR POWER SYSTEM MODELING USING PCA METHODS CONSIDERING INCOMPLETE DATA READINGS
Abstract:
Principal Component Analysis (PCA) is a popular method for dimension reduction used in many fields, including data compression, image processing, and exploratory data analysis. However, the traditional PCA method has several drawbacks: it is not efficient for high-dimensional data and cannot compute sufficiently accurate principal components when a relatively large portion of the data is missing. In this report, we propose to use the EM-PCA method for dimension reduction of power system measurements with missing data, and provide a comparative study of the traditional PCA and EM-PCA methods. Our extensive experimental results show that the EM-PCA method is more effective and more accurate than traditional PCA for dimension reduction of power system measurement data when a large portion of the data set is missing.
Abstract:
The recent liberalization of the German energy market has forced the energy industry to develop and install new information systems to support agents on the energy trading floors in their analytical tasks. Besides the classical approach of building a data warehouse that gives insight into the time series in order to understand market and pricing mechanisms, it is crucial to provide a variety of external data from the web. Weather information as well as political news or market rumors are relevant to give the appropriate interpretation to the variables of a volatile energy market. Starting from a multidimensional data model and a collection of buy and sell transactions, a data warehouse is built that gives analytical support to the agents. Following the idea of web farming, we harvest the web, match the external information sources after a filtering and evaluation process to the data warehouse objects, and present this qualified information on a user interface where market values are correlated with those external sources over the time axis.
Abstract:
Earth observations (EO) represent a growing and valuable resource for many scientific, research and practical applications carried out by users around the world. Access to EO data for some applications or activities, like climate change research or emergency response activities, becomes indispensable for their success. However, often EO data or products made of them are (or are claimed to be) subject to intellectual property law protection and are licensed under specific conditions regarding access and use. Restrictive conditions on data use can be prohibitive for further work with the data. The Global Earth Observation System of Systems (GEOSS) is an initiative led by the Group on Earth Observations (GEO) with the aim of providing coordinated, comprehensive, and sustained EO and information for making informed decisions in various areas beneficial to societies, their functioning and development. It seeks to share data with users world-wide with the fewest possible restrictions on their use by implementing the GEOSS Data Sharing Principles adopted by GEO. The Principles proclaim full and open exchange of data shared within GEOSS, while recognising relevant international instruments and national policies and legislation through which restrictions on the use of data may be imposed. The paper focuses on the issue of the legal interoperability of data that are shared with varying restrictions on use, with the aim of exploring the options for making data interoperable. The main question it addresses is whether the public domain or its equivalents represent the best mechanism to ensure the legal interoperability of data. To this end, the paper analyses legal protection regimes and their norms applicable to EO data. Based on the findings, it highlights the existing public law statutory, regulatory, and policy approaches, as well as private law instruments, such as waivers, licenses and contracts, that may be used to place the datasets in the public domain, or otherwise make them publicly available for use and re-use without restrictions. It uses GEOSS and its particular characteristics as a system to identify ways to reconcile the vast possibilities it provides through the sharing of data from various sources and jurisdictions on the one hand, and the restrictions on the use of the shared resources on the other. On a more general level, the paper seeks to draw attention to the obstacles and potential regulatory solutions for sharing factual or research data for purposes that go beyond research and education.
Abstract:
AIMS This study's objective is to assess the safety of non-therapeutic atomoxetine exposures reported to the US National Poison Database System (NPDS). METHODS This is a retrospective database study of non-therapeutic single agent ingestions of atomoxetine in children and adults reported to the NPDS between 2002 and 2010. RESULTS A total of 20 032 atomoxetine exposures were reported during the study period, and 12 370 of these were single agent exposures. The median age was 9 years (interquartile range 3, 14), and 7380 were male (59.7%). Of the single agent exposures, 8813 (71.2%) were acute exposures, 3315 (26.8%) were acute-on-chronic, and 166 (1.3%) were chronic. In 10 608 (85.8%) cases the exposure was unintentional, in 1079 (8.7%) it was a suicide attempt, and in 629 (5.1%) it involved abuse. Of these cases, 3633 (29.4%) were managed at health-care facilities. Acute-on-chronic exposure was associated with an increased risk of a suicidal reason for exposure compared with acute ingestions (odds ratio 1.44, 95% confidence interval 1.26-1.65). The most common clinical effects were drowsiness or lethargy (709 cases; 5.7%), tachycardia (555; 4.5%), and nausea (388; 3.1%). Major toxicity was observed in 21 cases (seizures in nine (42.9%), tachycardia in eight (38.1%), coma in six (28.6%), and ventricular dysrhythmia in one case (4.8%)). CONCLUSIONS Non-therapeutic atomoxetine exposures were largely safe, although seizures were rarely observed.
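The reported odds ratio compares the odds of a suicidal reason between acute-on-chronic and acute exposures. A sketch of the standard computation from a 2×2 table, using illustrative counts rather than the NPDS figures:

```python
import math

# Odds ratio and Wald 95% CI from a 2x2 table [[a, b], [c, d]]:
# rows = exposure type, columns = suicidal reason yes/no.
# The counts below are hypothetical, not the NPDS data.

def odds_ratio_ci(a, b, c, d, z=1.96):
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# e.g. acute-on-chronic: 400 suicidal / 2915 not; acute: 650 / 8163
or_, lo, hi = odds_ratio_ci(400, 2915, 650, 8163)
```

A confidence interval lying entirely above 1.0 (as in the study's 1.26-1.65) indicates a statistically significant increase in odds.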
Abstract:
Systemic immune activation, a major determinant of HIV disease progression, is the result of a complex interplay between viral replication, dysregulation of the immune system, and microbial translocation due to gut mucosal damage. While human genetic variants influencing HIV viral load have been identified, it is unknown to what extent the host genetic background contributes to inter-individual differences in other determinants of HIV pathogenesis like gut damage and microbial translocation. Using samples and data from 717 untreated participants in the Swiss HIV Cohort Study and a genome-wide association study design, we searched for human genetic determinants of plasma levels of intestinal fatty-acid binding protein (I-FABP/FABP2), a marker of gut damage, and of soluble CD14 (sCD14), a marker of LPS bioactivity and microbial translocation. We also assessed the correlations between HIV viral load, sCD14 and I-FABP. While we found no genome-wide significant determinant of the tested plasma markers, we observed strong associations between sCD14 and both HIV viral load and I-FABP, shedding new light on the relationships between processes that drive progression of untreated HIV infection.
Abstract:
BACKGROUND: We evaluated Swiss slaughterhouse data for integration into a national syndromic surveillance system for the early detection of emerging diseases in production animals. We analysed meat inspection data for cattle, pigs and small ruminants slaughtered between 2007 and 2012 (including emergency slaughters of sick/injured animals); investigating patterns in the number of animals slaughtered and condemned; the reasons invoked for whole carcass condemnations; reporting biases and regional effects. RESULTS: Whole carcass condemnation rates were fairly uniform (1-2‰) over time and between the different types of production animals. Condemnation rates were much higher and less uniform following emergency slaughters. The number of condemnations peaked in December for both cattle and pigs, a time when individuals of lower quality are sent to slaughter when hay and food are limited and when certain diseases are more prevalent. Each type of production animal was associated with a different profile of condemnation reasons. The most commonly reported one was "severe lesions" for cattle, "abscesses" for pigs and "pronounced weight loss" for small ruminants. These reasons could constitute valuable syndromic indicators as they are unspecific clinical manifestations of a large range of animal diseases (as well as potential indicators of animal welfare). Differences were detected in the rate of carcass condemnation between cantons and between large and small slaughterhouses. A large percentage (>60% for all three animal categories) of operating slaughterhouses never reported a condemnation between 2007 and 2012, a potential indicator of widespread non-reporting bias in our database. CONCLUSIONS: The current system offers simultaneous coverage of cattle, pigs and small ruminants for the whole of Switzerland, and traceability of each condemnation to its farm of origin. The number of condemnations was significantly linked to the number of slaughters, meaning that the former should always be offset by the latter in analyses. Because this denominator is only communicated at the end of the month, condemnations may currently only be monitored on a monthly basis. Coupled with the lack of timeliness (30-60 days delay between condemnation and notification), this limits the use of the data for early detection.
Abstract:
Correct predictions of future blood glucose levels in individuals with Type 1 Diabetes (T1D) can be used to provide early warning of upcoming hypo-/hyperglycemic events and thus to improve the patient's safety. To increase prediction accuracy and efficiency, various approaches have been proposed which combine multiple predictors to produce superior results compared to single predictors. Three methods for model fusion are presented and comparatively assessed. Data from 23 T1D subjects under sensor-augmented pump (SAP) therapy were used in two adaptive data-driven models (an autoregressive model with output correction - cARX, and a recurrent neural network - RNN). Data fusion techniques based on i) Dempster-Shafer Evidential Theory (DST), ii) Genetic Algorithms (GA), and iii) Genetic Programming (GP) were used to merge the complementary performances of the prediction models. The fused output is used in a warning algorithm to issue alarms of upcoming hypo-/hyperglycemic events. The fusion schemes showed improved performance with lower root mean square errors, lower time lags, and higher correlation. In the warning algorithm, median daily false alarms (DFA) of 0.25% and 100% correct alarms (CA) were obtained for both event types. The detection times (DT) before occurrence of events were 13.0 and 12.1 min for hypo- and hyperglycemic events respectively. Compared to the cARX and RNN models, and to a linear fusion of the two, the proposed fusion schemes represent a significant improvement.
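As a baseline sketch of predictor fusion (the linear fusion the paper compares against, not its DST/GA/GP schemes), two noisy predictors can be merged with least-squares weights; all signals here are synthetic, not CGM data:

```python
import numpy as np

# Linear-fusion baseline sketch: combine two glucose predictors with
# weights fitted by least squares. The cARX-like and RNN-like signals
# below are synthetic stand-ins, not the paper's models or data.

rng = np.random.default_rng(2)
truth = 120 + 30 * np.sin(np.linspace(0, 6, 300))   # mg/dL, synthetic profile
pred_arx = truth + rng.normal(0, 8, 300)            # cARX-like predictor
pred_rnn = truth + rng.normal(0, 12, 300)           # RNN-like predictor

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

A = np.column_stack([pred_arx, pred_rnn])
w, *_ = np.linalg.lstsq(A, truth, rcond=None)       # fusion weights
fused = A @ w
fused_rmse = rmse(fused, truth)                     # below both inputs' RMSE
```

Because the least-squares combination includes each single predictor as a special case, the fused RMSE can never exceed that of the better input; the paper's DST/GA/GP schemes aim to improve further on this baseline.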