59 results for Modified early warning scores (MEWS)

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

100.00%

Abstract:

Early warning of future hypoglycemic and hyperglycemic events can improve the safety of type 1 diabetes mellitus (T1DM) patients. The aim of this study is to design and evaluate a hypoglycemia/hyperglycemia early warning system (EWS) for T1DM patients under sensor-augmented pump (SAP) therapy.

Relevance:

100.00%

Abstract:

Introduction: Early warning of future hypoglycemic and hyperglycemic events can improve the safety of type 1 diabetes mellitus (T1DM) patients. The aim of this study is to design and evaluate a hypoglycemia/hyperglycemia early warning system (EWS) for T1DM patients under sensor-augmented pump (SAP) therapy. Methods: The EWS is based on the combination of data-driven online adaptive prediction models and a warning algorithm. Three modeling approaches have been investigated: (i) autoregressive (ARX) models, (ii) autoregressive with an output correction module (cARX) models, and (iii) recurrent neural network (RNN) models. The warning algorithm performs postprocessing of the models' outputs and issues alerts if upcoming hypoglycemic/hyperglycemic events are detected. Fusion of the cARX and RNN models, due to their complementary prediction performances, resulted in the hybrid autoregressive with an output correction module/recurrent neural network (cARN)-based EWS. Results: The EWS was evaluated on 23 T1DM patients under SAP therapy. The ARX-based system achieved hypoglycemic (hyperglycemic) event prediction with median values of accuracy of 100.0% (100.0%), detection time of 10.0 (8.0) min, and daily false alarms of 0.7 (0.5). The respective values for the cARX-based system were 100.0% (100.0%), 17.5 (14.8) min, and 1.5 (1.3) and, for the RNN-based system, were 100.0% (92.0%), 8.4 (7.0) min, and 0.1 (0.2). The hybrid cARN-based EWS gave the best results, with 100.0% (100.0%) prediction accuracy, a detection time of 16.7 (14.7) min in advance, and 0.8 (0.8) daily false alarms. Conclusion: Combined use of cARX and RNN models for the development of an EWS outperformed the single use of each model, achieving accurate and prompt event prediction with few false alarms, thus providing increased safety and comfort.
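
The abstract above does not include implementation details; the following is a minimal, hypothetical sketch of how an ARX-type glucose predictor feeding a threshold-based warning step could be structured. The model order, prediction horizon, and hypo/hyper thresholds are illustrative assumptions, not the authors' settings.

```python
import numpy as np

# Minimal sketch (not the authors' implementation): an autoregressive predictor
# fitted by least squares on recent CGM samples, followed by a simple
# threshold-based warning step. Order, horizon and thresholds are assumptions.

def fit_ar(glucose, order=4):
    """Estimate AR coefficients from a 1-D array of past glucose values."""
    X = np.column_stack([glucose[i:len(glucose) - order + i] for i in range(order)])
    y = glucose[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict_ahead(glucose, coeffs, steps=6):
    """Iterate the AR model 'steps' samples ahead (e.g. 30 min at 5-min sampling)."""
    window = list(glucose[-len(coeffs):])
    for _ in range(steps):
        window.append(float(np.dot(coeffs, window[-len(coeffs):])))
    return window[-1]

def warning(predicted, hypo=70.0, hyper=180.0):
    """Return an alert label if the prediction crosses an assumed threshold (mg/dL)."""
    if predicted <= hypo:
        return "hypoglycemia warning"
    if predicted >= hyper:
        return "hyperglycemia warning"
    return None

# Usage with a short synthetic CGM trace:
cgm = np.array([140, 135, 128, 120, 111, 101, 92, 84, 78, 74], dtype=float)
print(warning(predict_ahead(cgm, fit_ar(cgm), steps=6)))
```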

Relevance:

100.00%

Abstract:

Correct predictions of future blood glucose levels in individuals with Type 1 Diabetes (T1D) can be used to provide early warning of upcoming hypo-/hyperglycemic events and thus to improve the patient's safety. To increase prediction accuracy and efficiency, various approaches have been proposed which combine multiple predictors to produce superior results compared to single predictors. Three methods for model fusion are presented and comparatively assessed. Data from 23 T1D subjects under sensor-augmented pump (SAP) therapy were used in two adaptive data-driven models (an autoregressive model with output correction - cARX, and a recurrent neural network - RNN). Data fusion techniques based on (i) Dempster-Shafer Evidential Theory (DST), (ii) Genetic Algorithms (GA), and (iii) Genetic Programming (GP) were used to merge the complementary performances of the prediction models. The fused output is used in a warning algorithm to issue alarms of upcoming hypo-/hyperglycemic events. The fusion schemes showed improved performance with lower root mean square errors, lower time lags, and higher correlation. In the warning algorithm, median daily false alarms (DFA) of 0.25% and 100% correct alarms (CA) were obtained for both event types. The detection times (DT) before occurrence of events were 13.0 and 12.1 min for hypo- and hyperglycemic events, respectively. Compared to the cARX and RNN models, and a linear fusion of the two, the proposed fusion schemes represent a significant improvement.
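
As a rough illustration of the fusion idea (not the DST, GA, or GP schemes evaluated in the study), the sketch below merges two predictor outputs through a convex combination whose weight is chosen on validation data; all data and predictor names are synthetic.

```python
import numpy as np

# Illustrative fusion of two glucose predictors via a weighted average.
# The study's Dempster-Shafer, GA and GP schemes are more elaborate; this
# only conveys the general idea of merging complementary predictors.

def fuse(pred_a, pred_b, w):
    return w * pred_a + (1.0 - w) * pred_b

def pick_weight(pred_a, pred_b, truth, grid=np.linspace(0, 1, 101)):
    """Grid-search the fusion weight that minimizes RMSE against reference glucose."""
    rmse = [np.sqrt(np.mean((fuse(pred_a, pred_b, w) - truth) ** 2)) for w in grid]
    return grid[int(np.argmin(rmse))]

# Synthetic validation data: two imperfect predictors of the same signal.
rng = np.random.default_rng(42)
truth = np.linspace(80, 160, 50)
pred_a = truth + rng.normal(0, 8, truth.size)    # cARX-like output (assumed)
pred_b = truth + rng.normal(0, 12, truth.size)   # RNN-like output (assumed)
print(f"chosen fusion weight: {pick_weight(pred_a, pred_b, truth):.2f}")
```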

Relevance:

100.00%

Abstract:

Mapping and monitoring are believed to provide an early warning sign to determine when to stop tumor removal to avoid mechanical damage to the corticospinal tract (CST). The objective of this study was to systematically compare subcortical monopolar stimulation thresholds (1-20 mA) with direct cortical stimulation (DCS)-motor evoked potential (MEP) monitoring signal abnormalities and to correlate both with new postoperative motor deficits. The authors sought to define a mapping threshold and DCS-MEP monitoring signal changes indicating a minimal safe distance from the CST.

Relevance:

100.00%

Abstract:

Most cows encounter a state of negative energy balance during the periparturient period, which may lead to metabolic disorders and impaired fertility. The aim of this study was to assess the potential of milk fatty acids as diagnostic tools of detrimental levels of blood plasma nonesterified fatty acids (NEFA), defined as NEFA concentrations beyond 0.6 mmol/L, in a data set of 92 early lactating cows fed a glucogenic or lipogenic diet and subjected to a 0-, 30-, or 60-d dry period before parturition. Milk was collected in wk 2, 3, 4, and 8 (n = 368) and blood was sampled weekly from wk 2 to 8 after parturition. Milk was analyzed for milk fatty acids and blood plasma for NEFA. Data were classified as "at risk of detrimental blood plasma NEFA" (NEFA ≥ 0.6 mmol/L) and "not at risk of detrimental blood plasma NEFA" (NEFA <0.6 mmol/L). Concentrations of 45 milk fatty acids and the milk fat C18:1 cis-9-to-C15:0 ratio were subjected to a discriminant analysis. Milk fat C18:1 cis-9 proved to be the most discriminating variable for identifying detrimental blood plasma NEFA. A false positive rate of 10% allowed us to diagnose 46% of the detrimental blood plasma NEFA cases based on a milk fat C18:1 cis-9 concentration of at least 230 g/kg of milk fatty acids. Additionally, it was assessed whether the milk fat C18:1 cis-9 concentrations of wk 2 could be used as an early warning for detrimental blood plasma NEFA risk during the first 8 wk in lactation. Cows with at least 240 g/kg of C18:1 cis-9 in milk fat had about a 50% chance of encountering blood plasma NEFA values of 0.6 mmol/L or more during the first 8 wk of lactation, with a false positive rate of 11.4%. Profit simulations were based on costs for cows suffering from detrimental blood plasma NEFA, and costs for preventive treatment based on daily dosing of propylene glycol for 3 wk. Given the relatively low incidence rate (8% of all observations), continuous monitoring of milk fatty acids during the first 8 wk of lactation to diagnose detrimental blood plasma NEFA does not seem cost effective. In contrast, milk fat C18:1 cis-9 in the second lactation week could serve as an early warning of cows at risk of detrimental blood NEFA. In this case, selective treatment may be cost effective.
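
A hedged sketch of the kind of threshold screening described above: cows are flagged as at risk when milk fat C18:1 cis-9 exceeds a cutoff (the abstract reports 230 g/kg), and the detection and false positive rates of that rule are computed. The cow data below are synthetic; only the cutoff value is taken from the abstract.

```python
import numpy as np

# Sketch of a single-variable screening rule: flag cows whose milk fat
# C18:1 cis-9 exceeds a cutoff and evaluate detection vs. false positive rate.
# Synthetic data; the 230 g/kg cutoff is the value reported in the abstract.

def evaluate_cutoff(c18_1, at_risk, cutoff=230.0):
    flagged = c18_1 >= cutoff
    detection_rate = flagged[at_risk].mean()          # share of true cases flagged
    false_positive_rate = flagged[~at_risk].mean()    # share of healthy cows flagged
    return detection_rate, false_positive_rate

# Synthetic herd: 200 cows, ~8% prevalence of detrimental plasma NEFA.
rng = np.random.default_rng(0)
at_risk = rng.random(200) < 0.08
c18_1 = np.where(at_risk, rng.normal(250, 30, 200), rng.normal(190, 30, 200))
print(evaluate_cutoff(c18_1, at_risk))
```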

Relevance:

100.00%

Abstract:

BACKGROUND: Falls are common and serious problems in older adults. The goal of this study was to examine whether preclinical disability predicts incident falls in a European population of community-dwelling older adults. METHODS: Secondary data analysis was performed on a population-based longitudinal study of 1644 community-dwelling older adults living in London, U.K.; Hamburg, Germany; Solothurn, Switzerland. Data were collected at baseline and 1-year follow-up using a self-administered multidimensional health risk appraisal questionnaire, including validated questions on falls, mobility disability status (high function, preclinical disability, task difficulty), and demographic and health-related characteristics. Associations were evaluated using bivariate and multivariate logistic regression analyses. RESULTS: Overall incidence of falls was 24%, and increased by worsening mobility disability status: high function (17%), preclinical disability (32%), task difficulty (40%), test-of-trend p <.003. In multivariate analysis adjusting for other fall risk factors, preclinical disability (odds ratio [OR] = 1.7, 95% confidence interval [CI], 1.1-2.5), task difficulty (OR = 1.7, 95% CI, 1.1-2.6) and history of falls (OR = 4.7, 95% CI, 3.5-6.3) were the strongest significant predictors of falls. In stratified multivariate analyses, preclinical disability equally predicted falls in participants with (OR = 1.7, 95% CI, 1.0-3.0) and without history of falls (OR = 1.8, 95% CI, 1.1-3.0). CONCLUSIONS: This study provides longitudinal evidence that self-reported preclinical disability predicts incident falls at 1-year follow-up independent of other self-reported fall risk factors. Multidimensional geriatric assessment that includes preclinical disability may provide a unique early warning system as well as potential targets for intervention.

Relevance:

100.00%

Abstract:

BACKGROUND: The Prevention of cerebrovascular and cardiovascular Events of ischemic origin with teRutroban in patients with a history oF ischemic strOke or tRansient ischeMic attack (PERFORM) study is an international double-blind, randomized controlled trial designed to investigate the superiority of the specific TP receptor antagonist terutroban (30 mg/day) over aspirin (100 mg/day), in reducing cerebrovascular and cardiovascular events in patients with a recent history of ischemic stroke or transient ischemic attack. Here we describe the baseline characteristics of the population. METHODS AND RESULTS: Parameters recorded at baseline included vital signs, risk factors, medical history, and concomitant treatments, as well as stroke subtype, stroke-associated disability on the modified Rankin scale, and scores on scales for cognitive function and dependency. Eight hundred and two centers in 46 countries recruited a total of 19,119 patients between February 2006 and April 2008. The population is evenly distributed and is not dominated by any one country or region. The mean +/- SD age was 67.2 +/- 7.9 years, 63% were male, and 83% Caucasian; 83% had hypertension, and about half the population smoked or had quit smoking. Ninety percent of the qualifying events were ischemic stroke, 67% of which were classified as atherothrombotic or likely atherothrombotic (pure or coexisting with another cause). Modified Rankin scale scores showed slight or no disability in 83% of the population, while the scores on the Mini-Mental State Examination, Isaacs' Set Test, Zazzo's Cancellation Test, and the instrumental activities of daily living scale showed a good level of cognitive function and autonomy. CONCLUSIONS: The PERFORM study population is homogeneous in terms of demographic and disease characteristics. With 19,119 patients, the PERFORM study is powered to test the superiority of terutroban over aspirin in the secondary prevention of cerebrovascular and cardiovascular events in patients with a recent history of ischemic stroke or transient ischemic attack.

Relevance:

100.00%

Abstract:

Dynamic systems, especially in real-life applications, are often determined by inter-/intra-variability, uncertainties and time-varying components. Physiological systems are probably the most representative example, in which population variability, vital signal measurement noise and uncertain dynamics render their explicit representation and optimization a rather difficult task. Systems characterized by such challenges often require the use of adaptive algorithmic solutions able to perform an iterative structural and/or parametrical update process towards optimized behavior. Adaptive optimization presents the advantages of (i) individualization through learning of basic system characteristics, (ii) ability to follow time-varying dynamics and (iii) low computational cost. In this chapter, the use of online adaptive algorithms is investigated in two basic research areas related to diabetes management: (i) real-time glucose regulation and (ii) real-time prediction of hypo-/hyperglycemia. The applicability of these methods is illustrated through the design and development of an adaptive glucose control algorithm based on reinforcement learning and optimal control, and an adaptive, personalized early warning system that recognizes and raises alarms against hypo- and hyperglycemic events.
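
The chapter abstract names online adaptive prediction without giving code; the sketch below shows one common online-adaptive scheme, recursive least squares with a forgetting factor, applied to an autoregressive glucose model. This is an assumed example of the general approach, not the chapter's actual algorithm, and all numbers are synthetic.

```python
import numpy as np

# Recursive least squares (RLS) with a forgetting factor: one common way to
# adapt an autoregressive model online, sample by sample. Assumed illustration,
# not the chapter's algorithm.

class RLSPredictor:
    def __init__(self, order=4, forgetting=0.98):
        self.w = np.zeros(order)            # AR coefficients
        self.P = np.eye(order) * 1000.0     # inverse-covariance estimate
        self.lam = forgetting

    def update(self, history, target):
        """history: last 'order' samples (oldest first); target: the next sample."""
        x = np.asarray(history, dtype=float)
        k = self.P @ x / (self.lam + x @ self.P @ x)   # gain vector
        self.w = self.w + k * (target - self.w @ x)    # coefficient update
        self.P = (self.P - np.outer(k, x @ self.P)) / self.lam

    def predict(self, history):
        return float(self.w @ np.asarray(history, dtype=float))

# Usage with a synthetic CGM stream (one-step-ahead forecast at the end):
cgm = [150, 147, 143, 138, 132, 125, 118, 112, 107, 103]
model = RLSPredictor(order=4)
for t in range(4, len(cgm)):
    model.update(cgm[t - 4:t], cgm[t])
print(model.predict(cgm[-4:]))
```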

Relevance:

100.00%

Abstract:

BACKGROUND AND PURPOSE The DRAGON score predicts functional outcome in the hyperacute phase of intravenous thrombolysis treatment of ischemic stroke patients. We aimed to validate the score in a large multicenter cohort in anterior and posterior circulation. METHODS Prospectively collected data of consecutive ischemic stroke patients who received intravenous thrombolysis in 12 stroke centers were merged (n=5471). We excluded patients lacking data necessary to calculate the score and patients with missing 3-month modified Rankin scale scores. The final cohort comprised 4519 eligible patients. We assessed the performance of the DRAGON score with area under the receiver operating characteristic curve in the whole cohort for both good (modified Rankin scale score, 0-2) and miserable (modified Rankin scale score, 5-6) outcomes. RESULTS Area under the receiver operating characteristic curve was 0.84 (0.82-0.85) for miserable outcome and 0.82 (0.80-0.83) for good outcome. Proportions of patients with good outcome were 96%, 93%, 78%, and 0% for 0 to 1, 2, 3, and 8 to 10 score points, respectively. Proportions of patients with miserable outcome were 0%, 2%, 4%, 89%, and 97% for 0 to 1, 2, 3, 8, and 9 to 10 points, respectively. When tested separately for anterior and posterior circulation, there was no difference in performance (P=0.55); areas under the receiver operating characteristic curve were 0.84 (0.83-0.86) and 0.82 (0.78-0.87), respectively. No sex-related difference in performance was observed (P=0.25). CONCLUSIONS The DRAGON score showed very good performance in the large merged cohort in both anterior and posterior circulation strokes. The DRAGON score provides rapid estimation of patient prognosis and supports clinical decision-making in the hyperacute phase of stroke care (eg, when invasive add-on strategies are considered).
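
For readers unfamiliar with the reported discrimination measure, the sketch below computes the area under the receiver operating characteristic curve for an integer risk score against a binary outcome using the rank-based (Mann-Whitney) formulation; the data are synthetic, and only the idea of validating a 0 to 10 point score with AUC comes from the abstract.

```python
import numpy as np

# Rank-based (Mann-Whitney) AUC for a 0-10 integer risk score vs. a binary
# outcome. Synthetic data; illustrates the discrimination measure only.

def roc_auc(score, outcome):
    """Probability that a random positive case scores higher than a random negative case."""
    pos, neg = score[outcome == 1], score[outcome == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

rng = np.random.default_rng(1)
score = rng.integers(0, 11, size=500)                   # hypothetical score values
outcome = (rng.random(500) < score / 12).astype(int)    # outcome probability rises with score
print(round(roc_auc(score, outcome), 2))
```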

Relevance:

100.00%

Abstract:

OBJECTIVE To investigate the association of renal impairment with functional outcome and complications in stroke patients treated with IV thrombolysis (IVT). METHODS In this observational study, we compared the estimated glomerular filtration rate (GFR) with poor 3-month outcome (modified Rankin Scale scores 3-6), death, and symptomatic intracranial hemorrhage (sICH) based on the criteria of the European Cooperative Acute Stroke Study II trial. Unadjusted and adjusted odds ratios (ORs) with 95% confidence intervals (CIs) were calculated. Patients without IVT treatment served as a comparison group. RESULTS Among 4,780 IVT-treated patients, 1,217 (25.5%) had a low GFR (<60 mL/min/1.73 m²). A GFR decrease by 10 mL/min/1.73 m² increased the risk of poor outcome (OR [95% CI]): (unadjusted OR 1.20 [1.17-1.24]; adjusted OR 1.05 [1.01-1.09]), death (unadjusted OR 1.33 [1.28-1.38]; adjusted OR 1.18 [1.11-1.249]), and sICH (unadjusted OR 1.15 [1.01-1.22]; adjusted OR 1.11 [1.04-1.20]). Low GFR was independently associated with poor 3-month outcome (adjusted OR 1.32 [1.10-1.58]), death (adjusted OR 1.73 [1.39-2.14]), and sICH (adjusted OR 1.64 [1.21-2.23]) compared with normal GFR (60-120 mL/min/1.73 m²). Low GFR (adjusted OR 1.64 [1.21-2.23]) and stroke severity (adjusted OR 1.05 [1.03-1.07]) independently determined sICH. Compared with patients who did not receive IVT, treatment with IVT in patients with low GFR was associated with poor outcome (adjusted OR 1.79 [1.41-2.25]), and with favorable outcome in those with normal GFR (adjusted OR 0.77 [0.63-0.94]). CONCLUSION Renal function significantly modified outcome and complication rates in IVT-treated stroke patients. Lower GFR might be a better risk indicator for sICH than age. A decrease of GFR by 10 mL/min/1.73 m² seems to have a similar impact on the risk of death or sICH as a 1-point-higher NIH Stroke Scale score measuring stroke severity.
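
For orientation, if GFR enters the underlying logistic model linearly on the log-odds scale (the standard assumption, not stated explicitly in the abstract), the per-10 mL/min/1.73 m² odds ratios compound multiplicatively; for example, a 30 mL/min/1.73 m² lower GFR would correspond to an adjusted odds ratio for poor outcome of roughly 1.05^3 ≈ 1.16.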

Relevance:

100.00%

Abstract:

South Tyrol is a region that has often been affected by various mountain hazards such as floods, flash floods, debris flows, rock falls, and snow avalanches. Furthermore, areas at lower altitudes are often affected by high temperatures and heat waves. Climate change is expected to influence the frequency, magnitude, and spatial extent of these natural phenomena. For this reason, local authorities and other stakeholders need tools that enable them to reduce the risk posed by these processes. In the present study, a variety of methods are applied at the local level in different places in South Tyrol, aiming at: (1) the assessment of future losses caused by the occurrence of debris flows by using a vulnerability curve, (2) the assessment of social vulnerability based on the risk awareness of the people exposed to floods, and (3) the assessment of spatial exposure and social vulnerability of the population exposed to heat waves. The results show that, in South Tyrol, the risk from a number of hazards can be reduced by: (1) improving documentation of past events in order to improve existing vulnerability curves and the assessment of future losses, (2) raising citizens' awareness and responsibility to improve coping capacity for floods, and (3) extending heat wave early warning systems to more low-lying areas of South Tyrol.
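
As a purely hypothetical illustration of point (1), loss assessment with a vulnerability curve can be sketched as exposed value multiplied by a degree of loss read from a curve relating process intensity to damage; the curve shape and all numbers below are assumptions, not the study's fitted relationship.

```python
import numpy as np

# Hypothetical vulnerability-curve loss estimate for a debris flow:
# expected loss = building value x degree of loss at the given intensity.
# The S-shaped curve and its parameters are illustrative assumptions.

def vulnerability(intensity, midpoint=1.5, steepness=3.0):
    """Assumed curve mapping debris-flow intensity (deposit height, m) to degree of loss in [0, 1]."""
    return 1.0 / (1.0 + np.exp(-steepness * (intensity - midpoint)))

def expected_loss(building_value, intensity):
    return building_value * vulnerability(intensity)

# Example: a building worth 500,000 exposed to a 1.2 m deposit height.
print(f"{expected_loss(500_000, 1.2):.0f}")
```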

Relevance:

100.00%

Abstract:

BACKGROUND Flavobacterium psychrophilum is the agent of Bacterial Cold Water Disease and Rainbow Trout Fry Syndrome, two diseases leading to high mortality. Pathogen detection is mainly carried out using cultures, and more rapid and sensitive methods are needed. RESULTS We describe a qPCR technique based on the single-copy gene rpoC, which encodes the β' subunit of the DNA-dependent RNA polymerase. Its detection limit was 20 gene copies and its quantification limit 10³ gene copies per reaction. Tests on spiked spleens with known concentrations of F. psychrophilum (10⁶ to 10¹ cells per reaction) showed no cross-reactions between the spleen tissue and the primers and probe. Screening of water samples and spleens from symptomless and infected fishes indicated that the pathogen was already present before the outbreaks, but F. psychrophilum was only quantifiable in spleens from diseased fishes. CONCLUSIONS This qPCR can be used as a highly sensitive and specific method to detect F. psychrophilum in different sample types without the need for culturing. It allows reliable detection and quantification of F. psychrophilum in samples with low pathogen densities. Quantitative data on F. psychrophilum abundance could be useful for investigating risk factors linked to infections and also as an early warning system ahead of potentially devastating outbreaks.
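
The abstract reports detection and quantification limits but not the underlying calibration; the sketch below shows the generic standard-curve approach that typically sits behind such numbers: fit Cq against log10 copy number for a dilution series, then invert the line for unknown samples. The dilution series and Cq values are made up.

```python
import numpy as np

# Generic qPCR standard-curve quantification (illustrative values only):
# fit Cq = slope * log10(copies) + intercept on a dilution series, then
# invert the fit to estimate copy numbers in unknown samples.

log_copies = np.array([6.0, 5.0, 4.0, 3.0, 2.0])    # standards: 10^6 ... 10^2 copies/reaction
cq = np.array([18.1, 21.5, 24.9, 28.2, 31.6])       # hypothetical measured Cq values

slope, intercept = np.polyfit(log_copies, cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0              # amplification efficiency from the slope

def copies_from_cq(sample_cq):
    """Invert the standard curve to estimate the copy number of an unknown sample."""
    return 10 ** ((sample_cq - intercept) / slope)

print(f"efficiency ~ {efficiency:.0%}, estimated copies at Cq 26: {copies_from_cq(26.0):.0f}")
```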

Relevance:

100.00%

Abstract:

SUMMARY There is interest in the potential of companion animal surveillance to provide data to improve pet health and to provide early warning of environmental hazards to people. We implemented a companion animal surveillance system in Calgary, Alberta and the surrounding communities. Informatics technologies automatically extracted electronic medical records from participating veterinary practices and identified cases of enteric syndrome in the warehoused records. The data were analysed using time-series analyses and a retrospective space-time permutation scan statistic. We identified a seasonal pattern in reports of enteric syndromes in companion animals and four statistically significant clusters of enteric syndrome cases. The cases within each cluster were examined, and information was collected on the animals involved (species, age, sex), their vaccination history, possible exposure or risk-behaviour history, disease severity, and the aetiological diagnosis. We then assessed whether the cases within each cluster were unusual and whether they represented an animal or public health threat. There was often insufficient information recorded in the medical record to characterize the clusters by aetiology or exposures. Space-time analysis of companion animal enteric syndrome cases found evidence of clustering. Collection of more epidemiologically relevant data would enhance the utility of practice-based companion animal surveillance.