985 results for real time information


Relevance:

100.00%

Publisher:

Abstract:

Adverse drug events are one of the major causes of morbidity in developed countries, yet the drugs involved in these events have been trialled and approved on the basis of randomised controlled trials (RCTs), regarded as the study design that will produce the best evidence.

Though the focus on adverse drug events has been primarily on processes and outcomes associated with the use of these approved drugs, attention needs to be directed to the way in which the RCT study design is structured. The controls implemented to achieve internal validity in RCTs may be the very controls that reduce external validity, and may contribute to the levels of adverse drug events associated with the release of a new drug to the wider patient population.

An examination of these controls, and of the effects they can have on patient safety, underscores the importance of knowing how the clinical trials of a drug are undertaken, rather than relying only on the recorded outcomes.

As the majority of new drugs are likely to be prescribed to older patients who have one or more comorbidities in addition to that targeted by a new drug, and as the RCTs of those drugs typically under-represent the elderly and exclude patients with multiple comorbidities, timely assessment of drug safety signals is essential.

It is unlikely that regulatory jurisdictions will undertake a reassessment of safety issues for drugs that are already approved. Instead, reliance has been placed on adverse drug event reporting systems. Such systems have a very low reporting rate, and most adverse drug events remain unreported, at an eventual cost to patients and healthcare systems.

This makes near real-time systems that can pick up safety signals as they occur essential, so that modifications to the product information (or removal of the drug) can be implemented.
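One common family of techniques for picking up such signals from spontaneous reports is disproportionality analysis. As a hedged illustration (the abstract names no specific method, and all counts below are invented), here is a minimal proportional reporting ratio (PRR) computation:

```python
# Hypothetical sketch: the proportional reporting ratio (PRR), a
# disproportionality statistic commonly screened in near real-time
# pharmacovigilance. Counts are illustrative, not from the text.

def prr(a, b, c, d):
    """PRR for a drug-event pair from a 2x2 contingency table.

    a: reports with drug and event      b: reports with drug, other events
    c: reports with event, other drugs  d: reports with neither
    """
    rate_drug = a / (a + b)    # event rate among reports for the drug
    rate_other = c / (c + d)   # event rate among all other reports
    return rate_drug / rate_other

# A PRR well above 1 (conventionally >= 2, with supporting counts)
# flags a potential safety signal for review.
signal = prr(a=30, b=970, c=100, d=98900)
print(round(signal, 2))  # → 29.7
```

In a near real-time setting, such statistics would be recomputed as reports arrive, with elevated values triggering manual review rather than automatic action.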

Relevance:

100.00%

Publisher:

Abstract:

Climate change is perhaps the most pressing and urgent environmental issue facing the world today. However, our ability to predict and quantify the consequences of this change is severely limited by the paucity of in situ oceanographic measurements. Marine animals equipped with sophisticated oceanographic data loggers to study their behavior offer one solution to this problem, because marine animals range widely across the world's ocean basins and visit remote and often inaccessible locations. However, unlike the information collected from conventional oceanographic sensing equipment, which has been validated, the data collected from instruments deployed on marine animals over long periods have not. This is the first long-term study to validate in situ oceanographic data collected by animal oceanographers. We compared the ocean temperatures collected by leatherback turtles (Dermochelys coriacea) in the Atlantic Ocean with the ARGO network of ocean floats and could find no systematic errors that could be ascribed to sensor instability. Animal-borne sensors allowed water temperature to be monitored across a range of depths, over entire ocean basins and, importantly, over long periods, and so will play a key role in assessing global climate change through improved monitoring of global temperatures. This finding is especially pertinent given recent international calls for the development and implementation of a comprehensive Earth observation system (see http://iwgeo.ssc.nasa.gov/documents.asp?s=review) that includes the use of novel techniques for monitoring and understanding ocean and climate interactions to address strategic environmental and societal needs.
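The core of such a validation is a check for systematic error between paired measurements. A minimal sketch, with synthetic values (a real comparison would match casts in space, time and depth before differencing):

```python
# Hedged sketch: comparing animal-borne temperature records against
# co-located reference (e.g. float-network) values via paired differences.
# All numbers are synthetic, for illustration only.
from statistics import mean

def mean_bias(tag_temps, ref_temps):
    """Mean of tag-minus-reference differences for matched measurements.
    A mean near zero is consistent with no systematic sensor error."""
    diffs = [t - r for t, r in zip(tag_temps, ref_temps)]
    return mean(diffs)

tag = [14.2, 10.1, 8.0, 6.9]   # turtle-borne sensor temperatures (degC)
ref = [14.1, 10.2, 8.1, 6.8]   # matched reference temperatures (degC)
print(round(mean_bias(tag, ref), 3))  # near-zero mean bias
```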

Relevance:

100.00%

Publisher:

Abstract:

Drinking water distribution networks are exposed to the risk of malicious or accidental contamination. Several levels of response are conceivable. One of them consists of installing a sensor network to monitor the system in real time. Once a contamination has been detected, it is also important to take appropriate counter-measures. In the SMaRT-OnlineWDN project, this relies on modelling to predict both hydraulics and water quality. Using the model online makes it possible to identify the contaminant source and to simulate the contaminated area. The objective of this paper is to present the SMaRT-OnlineWDN experience and research results for hydraulic state estimation with a sampling frequency of a few minutes. A least-squares problem with bound constraints is formulated to adjust the demand-class coefficients to best fit the observed values at a given time. The criterion is a Huber function, which limits the influence of outliers. A Tikhonov regularization is introduced to take prior information on the parameter vector into account. The Levenberg-Marquardt algorithm, which uses derivative information to limit the number of iterations, is then applied. Confidence intervals for the state prediction are also given. The results are presented and discussed for real networks in France and Germany.
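The estimation idea can be sketched in miniature. This is a hedged scalar toy, not the project's code: it fits a single demand-class coefficient under a Huber criterion with Tikhonov regularization toward a prior, using iteratively reweighted least squares in place of Levenberg-Marquardt; all numbers are invented.

```python
# Toy robust estimation: minimize sum_i huber(y_i - h_i*x) + lam*(x - prior)^2
# for a scalar coefficient x, by iteratively reweighted least squares.
# Huber weights damp the influence of outlying observations.

def fit_coefficient(h, y, prior=1.0, lam=0.1, delta=1.0, iters=20):
    """h: sensitivities of observations to the coefficient; y: observed values."""
    x = prior
    for _ in range(iters):
        w = []
        for hi, yi in zip(h, y):
            r = yi - hi * x
            w.append(1.0 if abs(r) <= delta else delta / abs(r))  # Huber weight
        # Closed-form weighted update, including the Tikhonov term.
        num = sum(wi * hi * yi for wi, hi, yi in zip(w, h, y)) + lam * prior
        den = sum(wi * hi * hi for wi, hi in zip(w, h)) + lam
        x = num / den
    return x

# Observations mostly consistent with x ~= 2; the last reading is an outlier.
h = [1.0, 2.0, 3.0, 1.0]
y = [2.0, 4.1, 5.9, 10.0]
print(round(fit_coefficient(h, y), 2))  # close to 2 despite the outlier
```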

Relevance:

100.00%

Publisher:

Abstract:

Tensor3D is a geometric modeling program with the capacity to simulate and visualize, in real time, deformations specified through a tensor matrix and applied to triangulated models representing geological bodies. 3D visualization allows the study of deformational processes that are traditionally conducted in 2D, such as simple and pure shear. Besides geometric objects that are immediately available in the program window, the program can read other models from disk, and is thus able to import objects created with different open-source or proprietary programs. A strain ellipsoid and a bounding box are shown simultaneously and deformed instantly along with the main object. The principal axes of strain are visualized as well, to provide graphical information about the orientation of the tensor's normal components. The deformed models can also be saved, retrieved later and deformed again, in order to study different steps of progressive strain, or to make these data available to other programs. The shape of the stress ellipsoids and the corresponding Mohr circles defined by any stress tensor can also be represented. The application was written using the Visualization ToolKit, a powerful scientific visualization library in the public domain. This development choice, together with the use of the Tcl/Tk programming language, which is independent of the host computational platform, makes the program a useful tool for the study of geometric deformations directly in three dimensions, in teaching as well as in research activities. (C) 2007 Elsevier Ltd. All rights reserved.
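The core operation described above — applying a deformation specified as a tensor matrix to every vertex of a triangulated model — can be sketched as follows (a minimal pure-Python illustration; the real program renders with VTK, and all values here are invented):

```python
# Hedged sketch: apply a 3x3 deformation tensor to model vertices.

def deform(vertices, F):
    """Apply deformation tensor F (3x3, row-major) to a list of xyz points."""
    out = []
    for x, y, z in vertices:
        out.append(tuple(F[i][0] * x + F[i][1] * y + F[i][2] * z
                         for i in range(3)))
    return out

# Simple shear of magnitude gamma in the x-direction on the xy-plane,
# one of the deformations the abstract mentions studying.
gamma = 0.5
simple_shear = [[1.0, gamma, 0.0],
                [0.0, 1.0,   0.0],
                [0.0, 0.0,   1.0]]
print(deform([(0.0, 2.0, 0.0)], simple_shear))  # x picks up gamma * y
```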

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a technique for real-time crowd density estimation based on the textures of crowd images. In this technique, the current image from a sequence of input images is classified into a crowd density class. The classification is then corrected by a low-pass filter based on the crowd density classification of the last n images of the input sequence. The technique achieved 73.89% correct classification in a real-time application on a sequence of 9892 crowd images. Distributed processing was used in order to obtain real-time performance. © Springer-Verlag Berlin Heidelberg 2005.
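The temporal correction step can be sketched as below. The paper does not specify the filter's exact form; a mode (majority) filter over the last n classes is one plausible low-pass correction for categorical outputs, and the class labels here are invented.

```python
# Hedged sketch: smooth per-frame crowd density classes with a majority
# vote over the last n classifications, damping single-frame glitches.
from collections import Counter, deque

class DensitySmoother:
    def __init__(self, n=5):
        self.history = deque(maxlen=n)   # sliding window of recent classes

    def update(self, predicted_class):
        self.history.append(predicted_class)
        # Most common class in the window; ties resolve to the earliest seen.
        return Counter(self.history).most_common(1)[0][0]

s = DensitySmoother(n=5)
stream = ["low", "low", "high", "low", "low"]  # one-frame misclassification
print([s.update(c) for c in stream])           # the glitch is filtered out
```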

Relevance:

100.00%

Publisher:

Abstract:

This paper adapts decentralized OPF optimization to the AC power flow problem in power systems with interconnected areas operated by different transmission system operators (TSOs). The proposed methodology makes it possible to find the operating point of a particular area without explicit knowledge of the network data of the other interconnected areas; it is only necessary to exchange border information related to the tie-lines between areas. The methodology is based on the decomposition of the first-order optimality conditions of the AC power flow, which is formulated as a nonlinear programming problem. To allow better visualization of the concept of independent operation of each TSO, an artificial neural network has been used for computing the border information of the interconnected TSOs. A multi-area power flow tool can be seen as a basic building block able to address a large number of problems under a multi-TSO competitive market philosophy. The IEEE RTS-96 power system is used to show the operation and effectiveness of the decentralized AC power flow. ©2010 IEEE.
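The border-exchange idea can be illustrated with a deliberately simplified toy: a linear DC power flow (not the paper's AC formulation) on a four-bus, two-area system, where each area repeatedly solves its own balance equations and publishes only its tie-line bus angle. All susceptances and injections are invented.

```python
# Hedged toy: two areas joined by tie-line 2-3 exchange only border
# angles, Gauss-Seidel style, until their DC power flows agree.
# Area A: buses 1 (slack) and 2.  Area B: buses 3 and 4.

def solve_two_areas(iters=40):
    b12, b23, b34 = 10.0, 5.0, 10.0   # line susceptances (tie line is 2-3)
    p2, p3, p4 = -1.0, -1.0, -1.0     # bus injections (negative = load)
    th1 = 0.0                          # area A slack bus angle
    th2 = th3 = 0.0
    for _ in range(iters):
        # Area A: update its border angle using B's last published angle.
        th2 = (p2 + b12 * th1 + b23 * th3) / (b12 + b23)
        # Area B: solve its own balance (bus 4 eliminated in closed form)
        # using only A's border angle.
        th3 = th2 + (p3 + p4) / b23
    tie_flow = b23 * (th2 - th3)       # power exported from A to B
    return th2, th3, tie_flow

th2, th3, flow = solve_two_areas()
print(round(flow, 3))  # A exports enough to cover B's total load
```

Neither area ever sees the other's internal network data, only the tie-line boundary angle, which is the essence of the border-information exchange described above.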

Relevance:

100.00%

Publisher:

Abstract:

Motion control is a sub-field of automation in which the position and/or velocity of machines are controlled using some type of device. In motion control, the position, velocity, force, pressure, etc. profiles are designed in such a way that the different mechanical parts work as a harmonious whole, in which perfect synchronization must be achieved. The real-time exchange of information in the distributed system that an industrial plant is nowadays plays an important role in achieving ever better performance, effectiveness and safety. The network connecting field devices such as sensors and actuators, field controllers such as PLCs, regulators and drive controllers, and man-machine interfaces is commonly called a fieldbus. Since motion transmission is now a task of the communication system, and no longer of kinematic chains as in the past, the communication protocol must ensure that the desired profiles, and their properties, are correctly transmitted to the axes and then reproduced; otherwise the synchronization among the different parts is lost, with all the resulting consequences. In this thesis, the problem of trajectory reconstruction in the case of an event-triggered communication system is addressed. The most important feature that a real-time communication system must have is the preservation of the following temporal and spatial properties: absolute temporal consistency, relative temporal consistency and spatial consistency. Starting from the basic system composed of one master and one slave, and passing through systems made up of many slaves and one master, or many masters and one slave, the problems in profile reconstruction and in the preservation of the temporal properties, and subsequently in the synchronization of different profiles in networks adopting an event-triggered communication system, are shown. These networks are characterized by the fact that a common knowledge of the global time is not available; they are therefore non-deterministic networks. Each topology is analyzed, and the solution based on phase-locked loops proposed for the basic master-slave case is extended to handle the other configurations.
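The phase-locked-loop idea for the basic master-slave case can be sketched in software. This is a hedged toy, not the thesis's design: a slave reconstructs the master's time base from sporadically received timestamps using a proportional-integral correction of phase (offset) and frequency (drift); gains and drift values are invented.

```python
# Hedged sketch: a software PLL disciplining a slave's view of the
# master clock from received (local_time, master_time) timestamp pairs.

class SoftwarePLL:
    def __init__(self, kp=0.7, ki=0.3):
        self.kp, self.ki = kp, ki   # proportional and integral gains
        self.offset = 0.0           # tracked phase correction
        self.drift = 0.0            # tracked frequency correction
        self.last = 0.0             # local time of the last update

    def now(self, local_time):
        """Reconstructed master time, extrapolated from the last update."""
        return local_time + self.offset + self.drift * (local_time - self.last)

    def update(self, local_time, master_time):
        """Correct the loop with one received timestamp pair."""
        est = self.now(local_time)
        err = master_time - est
        self.offset = est - local_time + self.kp * err   # phase step
        self.drift += self.ki * err                      # frequency step
        self.last = local_time

# Master clock runs 1% fast and starts 5 units ahead of the slave clock.
pll = SoftwarePLL()
for t in range(1, 51):
    pll.update(float(t), 1.01 * t + 5.0)
print(abs(pll.now(50.0) - (1.01 * 50 + 5.0)) < 1e-3)  # locked onto the master
```

Between events, `now()` extrapolates with the tracked drift, which is what allows profile reproduction to continue even though events arrive irregularly.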

Relevance:

100.00%

Publisher:

Abstract:

The hydrologic risk (and the hydro-geologic one, closely related to it) is, and has always been, a very relevant issue, due to the severe consequences that flooding, and waters in general, may provoke in terms of human and economic losses. Floods are natural phenomena, often catastrophic, and cannot be avoided, but their damages can be reduced if they are predicted sufficiently far in advance. For this reason, flood forecasting plays an essential role in hydro-geological and hydrological risk prevention. Thanks to the development of sophisticated meteorological, hydrologic and hydraulic models, flood forecasting has made significant progress in recent decades; nonetheless, models are imperfect, which means that we are still left with a residual uncertainty about what will actually happen. In this thesis, this type of uncertainty is what will be discussed and analyzed. In operational problems, it can be affirmed that the ultimate aim of forecasting systems is not to reproduce the river's behavior: this is only a means of reducing the uncertainty associated with what will happen as a consequence of a precipitation event. In other words, the main objective is to assess whether or not preventive interventions should be adopted and which operational strategy may represent the best option. The main problem for a decision maker is to interpret model results and translate them into an effective intervention strategy. To make this possible, it is necessary to clearly define what is meant by uncertainty, since confusion is often made on this issue in the literature. Therefore, the first objective of this thesis is to clarify this concept, starting with a key question: should the choice of the intervention strategy be based on evaluating the model prediction for its ability to represent reality, or on evaluating what will actually happen on the basis of the information given by the model forecast?
Once the previous idea is made unambiguous, the other main concern of this work is to develop a tool that can provide effective decision support, making objective and realistic risk evaluations possible. In particular, such a tool should be able to provide an uncertainty assessment that is as accurate as possible. This primarily means three things: it must be able to correctly combine all the available deterministic forecasts, it must assess the probability distribution of the predicted quantity, and it must quantify the flooding probability. Furthermore, given that the time available to implement prevention strategies is often limited, the flooding probability has to be linked to the time of occurrence. For this reason, it is necessary to quantify the flooding probability within a time horizon related to the time required to implement the intervention strategy, and it is also necessary to assess the probability distribution of the flooding time.
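The last two requirements can be illustrated with a hedged toy (synthetic ensemble, not the thesis's model): given an ensemble of forecast water-level trajectories, estimate the probability of exceeding a flooding threshold within the decision horizon, together with the first exceedance times of the members that do flood.

```python
# Hedged sketch: flooding probability within a decision horizon, and the
# empirical distribution of flooding time, from an ensemble forecast.
# Levels, threshold and horizon are invented for illustration.

def flood_probability(ensemble, threshold, horizon):
    """ensemble: list of level time series (one value per hour).
    Returns (P(flood within horizon), first exceedance hours of flooded members)."""
    times = []
    flooded = 0
    for levels in ensemble:
        for hour, level in enumerate(levels[:horizon]):
            if level >= threshold:
                flooded += 1
                times.append(hour)
                break
    return flooded / len(ensemble), times

ensemble = [
    [2.0, 2.5, 3.1, 3.6],   # exceeds the threshold at hour 2
    [2.0, 2.3, 2.7, 2.9],   # stays below the threshold
    [2.1, 2.8, 3.0, 3.4],   # exceeds the threshold at hour 2
    [2.0, 2.2, 2.4, 2.6],   # stays below the threshold
]
p, times = flood_probability(ensemble, threshold=3.0, horizon=4)
print(p, times)  # half the members flood, both at hour 2
```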

Relevance:

100.00%

Publisher:

Abstract:

Environmental computer models are deterministic models devoted to predicting environmental phenomena such as air pollution or meteorological events. Numerical model output is given in terms of averages over grid cells, usually at high spatial and temporal resolution. However, these outputs are often biased, with unknown calibration, and are not equipped with any information about the associated uncertainty. Conversely, data collected at monitoring stations are more accurate, since they essentially provide the true levels. Given the leading role played by numerical models, it is now important to compare model output with observations. Statistical methods developed to combine numerical model output and station data are usually referred to as data fusion. In this work, we first combine ozone monitoring data with ozone predictions from the Eta-CMAQ air quality model in order to forecast in real time the current 8-hour average ozone level, defined as the average of the previous four hours, the current hour, and the predictions for the next three hours. We propose a Bayesian downscaler model based on first differences, with a flexible coefficient structure and an efficient computational strategy to fit the model parameters. Model validation for the eastern United States shows a substantial improvement of our fully inferential approach over the current real-time forecasting system. Furthermore, we consider the introduction of temperature data from a weather forecast model into the downscaler, showing improved real-time ozone predictions. Finally, we introduce a hierarchical model to obtain the spatially varying uncertainty associated with numerical model output. We show how such uncertainty can be learned through suitable stochastic data fusion modeling using some external validation data. We illustrate our Bayesian model by providing the uncertainty map associated with a temperature output over the northeastern United States.
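The forecast target defined above mixes observed and predicted hours. A minimal sketch of that definition (all ozone values are hypothetical):

```python
# Hedged sketch of the 8-hour average ozone target: the average over
# hours t-4 .. t+3, where the first five hours are observed and the
# last three come from the model forecast.

def eight_hour_average(observed, predicted, t):
    """observed: hourly measurements keyed by hour; predicted: model
    forecasts for future hours. Returns the 8-hour average at hour t."""
    window = [observed[h] for h in range(t - 4, t + 1)]    # past 4 + current
    window += [predicted[h] for h in range(t + 1, t + 4)]  # next 3 forecast
    return sum(window) / 8.0

observed = {8: 40, 9: 44, 10: 48, 11: 52, 12: 56}          # ppb, hypothetical
predicted = {13: 60, 14: 62, 15: 58}
print(eight_hour_average(observed, predicted, t=12))  # → 52.5
```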

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVES: To investigate the contribution of a real-time PCR assay for the detection of Treponema pallidum in various biological specimens with the secondary objective of comparing its value according to HIV status. METHODS: Prospective cohort of incident syphilis cases from three Swiss hospitals (Geneva and Bern University Hospitals, Outpatient Clinic for Dermatology of Triemli, Zurich) diagnosed between January 2006 and September 2008. A case-control study was nested into the cohort. Biological specimens (blood, lesion swab or urine) were taken at diagnosis (as clinical information) and analysed by real-time PCR using the T pallidum 47 kDa gene. RESULTS: 126 specimens were collected from 74 patients with primary (n = 26), secondary (n = 40) and latent (n = 8) syphilis. Among primary syphilis, sensitivity was 80% in lesion swabs, 28% in whole blood, 55% in serum and 29% in urine, whereas among secondary syphilis, it was 20%, 36%, 47% and 44%, respectively. Among secondary syphilis, plasma and cerebrospinal fluid were also tested and provided a sensitivity of 100% and 50%, respectively. The global sensitivity of T pallidum by PCR (irrespective of the compartment tested) was 65% during primary, 53% during secondary and null during latent syphilis. No difference regarding serology or PCR results was observed among HIV-infected patients. Specificity was 100%. CONCLUSIONS: Syphilis PCR provides better sensitivity in lesion swabs from primary syphilis and displays only moderate sensitivity in blood from primary and secondary syphilis. HIV status did not modify the internal validity of PCR for the diagnosis of primary or secondary syphilis.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, a simulation model of glucose-insulin metabolism for Type 1 diabetes patients is presented. The proposed system is based on the combination of Compartmental Models (CMs) and artificial Neural Networks (NNs). This model aims at the development of an accurate system to assist Type 1 diabetes patients in handling their blood glucose profile and recognizing dangerous metabolic states. Data from a Type 1 diabetes patient, stored in a database, have been used as input to the hybrid system. The data contain information about measured blood glucose levels, insulin intake, and descriptions of food intake, along with the corresponding times. The data are passed to three separate CMs, which produce estimations of (i) the effect of Short Acting (SA) insulin intake on blood insulin concentration, (ii) the effect of Intermediate Acting (IA) insulin intake on blood insulin concentration, and (iii) the effect of carbohydrate intake on blood glucose absorption from the gut. The outputs of the three CMs are passed to a Recurrent NN (RNN) in order to predict subsequent blood glucose levels. The RNN is trained with the Real Time Recurrent Learning (RTRL) algorithm. The resulting blood glucose predictions are promising for the use of the proposed model for blood glucose level estimation in Type 1 diabetes patients.
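The compartmental-model stage can be illustrated with a deliberately simple toy (not the paper's equations; rate constants and dose are invented): a first-order two-compartment chain in which subcutaneously dosed short-acting insulin absorbs from a depot into plasma and is then eliminated.

```python
# Hedged toy compartmental model: depot --ka--> plasma --ke--> eliminated,
# integrated with explicit Euler steps. All parameters are illustrative.

def simulate_insulin(dose, ka=0.025, ke=0.02, minutes=300, dt=1.0):
    """Returns the plasma insulin amount per minute (arbitrary units)."""
    depot, plasma = dose, 0.0
    series = []
    for _ in range(int(minutes / dt)):
        absorbed = ka * depot * dt        # first-order absorption from depot
        depot -= absorbed
        plasma += absorbed - ke * plasma * dt   # absorption minus elimination
        series.append(plasma)
    return series

curve = simulate_insulin(dose=10.0)
peak_minute = curve.index(max(curve))
print(peak_minute)  # plasma insulin rises to a peak, then decays
```

In the hybrid system described above, curves like this one (for SA and IA insulin, plus gut glucose absorption) are what the recurrent network consumes to predict subsequent blood glucose levels.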

Relevance:

100.00%

Publisher:

Abstract:

The Personal Health Assistant Project (PHA) is a pilot system implementation sponsored by the Kozani Region Governors' Association (KRGA) and installed in one of the two major public hospitals of the city of Kozani. PHA is intended to demonstrate how a secure, networked, multipurpose electronic health and food benefits digital signage system can transform common TV sets inside patient homes or hospital rooms into health care media players, facilitate information sharing, and improve administrative efficiency among private doctors, public health care providers, informal caregivers and nutrition program private companies, while placing individual patients firmly in control of the information at hand. This case evaluation of the PHA demonstration is intended to provide critical information to other decision makers considering implementing PHA or related digital signage technology at other institutions and public hospitals around the globe.

Relevance:

100.00%

Publisher:

Abstract:

Purpose: Selective retina therapy (SRT) has shown great promise compared to conventional retinal laser photocoagulation, as it avoids collateral damage and selectively targets the retinal pigment epithelium (RPE). Its use, however, is challenging in terms of therapy monitoring and dosage, because an immediate tissue reaction is not biomicroscopically discernible. To overcome these limitations, real-time optical coherence tomography (OCT) might be useful to monitor retinal tissue during laser application. We have thus evaluated a proprietary OCT system for its capability of mapping the optical changes introduced by SRT in retinal tissue. Methods: Freshly enucleated porcine eyes, covered in DMEM upon collection, were utilized, and a total of 175 scans from ex-vivo porcine eyes were analyzed. The porcine eyes were used as an ex-vivo model, and the results were compared with two time-resolved OCT scans recorded from a patient undergoing SRT treatment (SRT Vario, Medical Laser Center Lübeck). In addition to OCT, fluorescein angiography and fundus photography were performed on the patient, and the OCT scans were subsequently investigated for optical tissue changes linked to laser application. Results: Biomicroscopically invisible SRT lesions were detectable in OCT through changes in the RPE/Bruch's complex, both in vivo and in the porcine ex-vivo model. Laser application produced clearly visible optical effects such as hyper-reflectivity and tissue distortion in the treated retina. Tissue effects were discernible in time-resolved OCT imaging even when no hyper-reflectivity persisted after treatment. Data from ex-vivo porcine eyes showed similar or identical optical changes, while the effects visible in OCT appeared to correlate with the applied pulse energy, leading to an additional reflective layer when lesions became visible in indirect ophthalmoscopy.
Conclusions: Our results support the hypothesis that real-time high-resolution OCT may be a promising modality for obtaining additional information about the extent of tissue damage caused by SRT treatment. The data show that our ex-vivo porcine model adequately reproduces the effects occurring in vivo, and can thus be used to further investigate this promising imaging technique.