855 results for structural health monitoring (SHM)
Abstract:
Objectives We studied the relationship between changes in body composition and changes in blood pressure levels. Background The mechanisms underlying the frequently observed progression from pre-hypertension to hypertension are poorly understood. Methods We examined 1,145 subjects from a population-based survey at baseline in 1994/1995 and at follow-up in 2004/2005. First, we studied individuals pre-hypertensive at baseline who, during 10 years of follow-up, either had normalized blood pressure (PreNorm, n = 48), persistently had pre-hypertension (PrePre, n = 134), or showed progression to hypertension (PreHyp, n = 183). In parallel, we studied predictors for changes in blood pressure category in individuals hypertensive at baseline (n = 429). Results After 10 years, the PreHyp group was characterized by a marked increase in body weight (+5.71% [95% confidence interval (CI): 4.60% to 6.83%]) that was largely the result of an increase in fat mass (+17.8% [95% CI: 14.5% to 21.0%]). In the PrePre group, both the increases in body weight (+1.95% [95% CI: 0.68% to 3.22%]) and fat mass (+8.09% [95% CI: 4.42% to 11.7%]) were significantly less pronounced than in the PreHyp group (p < 0.001 for both). The PreNorm group showed no significant change in body weight (-1.55% [95% CI: -3.70% to 0.61%]) and fat mass (+0.20% [95% CI: -6.13% to 6.52%], p < 0.05 for both, vs. the PrePre group). Conclusions After 10 years of follow-up, hypertension developed in 50.1% of individuals with pre-hypertension and only 6.76% went from hypertensive to pre-hypertensive blood pressure levels. An increase in body weight and fat mass was a risk factor for the development of sustained hypertension, whereas a decrease was predictive of a decrease in blood pressure. (J Am Coll Cardiol 2010; 56: 65-76) (C) 2010 by the American College of Cardiology Foundation
Abstract:
Objectives This prospective study evaluated the association of obesity and hypertension with left atrial (LA) volume over 10 years. Background Although left atrial enlargement (LAE) is an independent risk factor for atrial fibrillation, stroke, and death, little information is available about determinants of LA size in the general population. Methods Participants (1,212 men and women, age 25 to 74 years) originated from a sex- and age-stratified random sample of German residents of the Augsburg area (MONICA S3). Left atrial volume was determined by standardized echocardiography at baseline and again after 10 years. Left atrial volume was indexed to body height (iLA). Left atrial enlargement was defined as iLA >= 35.7 and >= 33.7 ml/m in men and women, respectively. Results At baseline, the prevalence of LAE was 9.8%. Both obesity and hypertension were independent predictors of LAE, obesity (odds ratio [OR]: 2.4; p < 0.001) being numerically stronger than hypertension (OR: 2.2; p < 0.001). Adjusted mean values for iLA were significantly lower in normal-weight hypertensive patients (25.4 ml/m) than in obese normotensive individuals (27.3 ml/m; p = 0.016). The highest iLA was found in the obese hypertensive subgroup (30.0 ml/m; p < 0.001 vs. all other groups). This group also presented with the highest increase in iLA (+6.0 ml/m) and the highest incidence (31.6%) of LAE upon follow-up. Conclusions In the general population, obesity appears to be the most important risk factor for LAE. Given the increasing prevalence of obesity, early interventions, especially in young obese individuals, are essential to prevent premature onset of cardiac remodeling at the atrial level. (J Am Coll Cardiol 2009; 54: 1982-9) (C) 2009 by the American College of Cardiology Foundation
Abstract:
Introduction: The ACCM/PALS guidelines address early correction of paediatric septic shock using conventional measures. In the evolution of these recommendations, indirect measures of the balance between systemic oxygen delivery and demands using central venous or superior vena cava oxygen saturation (ScvO2 >= 70%) in a goal-directed approach have been added. However, while these additional goal-directed endpoints are based on evidence-based adult studies, the extrapolation to the paediatric patient remains unvalidated. Objective: The purpose of this study was to compare treatment according to the ACCM/PALS guidelines, performed with and without ScvO2 goal-directed therapy, on the morbidity and mortality rate of children with severe sepsis and septic shock. Design, participants and interventions: Children and adolescents with severe sepsis or fluid-refractory septic shock were randomly assigned to ACCM/PALS with or without ScvO2 goal-directed resuscitation. Measurements: Twenty-eight-day mortality was the primary endpoint. Results: Of the 102 enrolled patients, 51 received ACCM/PALS with ScvO2 goal-directed therapy and 51 received ACCM/PALS without ScvO2 goal-directed therapy. ScvO2 goal-directed therapy resulted in lower mortality (28-day mortality 11.8% vs. 39.2%, p = 0.002) and fewer new organ dysfunctions (p = 0.03). ScvO2 goal-directed therapy resulted in more crystalloid (28 (20-40) vs. 5 (0-20) ml/kg, p < 0.0001), blood transfusion (45.1% vs. 15.7%, p = 0.002) and inotropic (29.4% vs. 7.8%, p = 0.01) support in the first 6 h. Conclusions: This study supports the current ACCM/PALS guidelines. Goal-directed therapy using the endpoint of an ScvO2 >= 70% has a significant and additive impact on the outcome of children and adolescents with septic shock.
Abstract:
BACKGROUND: The development of newer diagnostic technologies has reduced the need for invasive electroencephalographic (EEG) studies in identifying the epileptogenic zone, especially in adult patients with mesial temporal lobe epilepsy and hippocampal sclerosis (MTLE-HS). OBJECTIVE: To evaluate ictal single photon emission computed tomography (SPECT) in the evaluation and treatment of patients with MTLE-HS. METHODS: MTLE patients were randomly assigned to presurgical evaluation with ictal SPECT (SPECT, n = 124) or without ictal SPECT (non-SPECT, n = 116) in an intent-to-treat protocol. Primary end points were the proportion of patients with invasive EEG studies and the proportion offered surgery. Secondary end points were the length of hospital stay, the proportion of patients with secondarily generalized seizures (SGS) during video-EEG, postsurgical seizure outcome, and hospital cost. RESULTS: The proportion of patients offered surgery was similar in the SPECT (85%) and non-SPECT groups (81%), as was the proportion that had invasive EEG studies (27% vs 23%). The mean duration of hospital stay was 1 day longer for the SPECT group (P < 0.001). SGS occurred in 51% of the SPECT and 26% of the non-SPECT group (P < 0.001). The cost of the presurgical evaluation was 35% higher for the SPECT compared with the non-SPECT group (P < 0.001). The proportion of patients seizure-free after surgery was similar in the SPECT (59%) and non-SPECT (54%) groups. CONCLUSION: Ictal SPECT did not add localizing value, beyond that provided by video-EEG telemetry and structural MRI, that altered the surgical decision or outcome for MTLE-HS patients. Ictal SPECT increased hospital stay and was associated with increased costs and a higher chance of SGS during video-EEG monitoring. These findings support the notion that a protocol including ictal SPECT is equivalent to one without SPECT in the presurgical evaluation of adult patients with MTLE-HS.
Abstract:
In humans, hydromorphone (HMOR) is metabolised principally by conjugation with glucuronic acid to form hydromorphone-3-glucuronide (H3G), a close structural analogue of morphine-3-glucuronide (M3G), the major metabolite of morphine. In a previous study we described the biochemical synthesis of H3G together with a preliminary evaluation of its pharmacology, which revealed that it is a neuro-excitant in rats in a manner analogous to M3G. Thus, the aims of the current study were to quantify the neuro-excitatory behaviours evoked by intracerebroventricular (icv) H3G in the rat and to define its potency relative to M3G. Groups of adult male Sprague-Dawley rats received icv injections (1 µL) of H3G (1-3 µg), M3G (2-7 µg) or vehicle via a stainless steel guide cannula that had been implanted stereotaxically seven days prior to drug administration. Behavioural excitation was monitored by scoring fifteen different behaviours (myoclonic jerks, chewing, wet-dog shakes, rearing, tonic-clonic convulsions, explosive motor behaviour, grooming, exploring, general activity, eating, staring, ataxia, righting reflex, body posture, touch-evoked agitation) immediately prior to icv injection and at the following post-dosing times: 5, 15, 25, 35, 50, 65 and 80 min. H3G produced dose-dependent behavioural excitation in a manner analogous to that reported previously for M3G by our laboratory and reproduced herein. H3G was found to be approximately 2.5-fold more potent than M3G, such that the mean (± S.D.) ED50 values were 2.3 (± 0.1) µg and 6.1 (± 0.6) µg, respectively. Thus, our data clearly imply that if H3G crosses the BBB with efficiency equivalent to that of M3G, then the myoclonus, allodynia and seizures observed in some patients dosed chronically with large systemic doses of HMOR are almost certainly due to the accumulation of sufficient H3G in the central nervous system to evoke behavioural excitation. (C) 2001 Elsevier Science Inc. All rights reserved.
Abstract:
This paper proposes the creation of an objectively acquired reference database to more accurately characterize the incidence and long-term risk of relatively infrequent, but serious, adverse events. Such a database would be maintained longitudinally to provide for ongoing comparison with new rheumatologic drug safety databases collecting the occurrences and treatments of rare events. We propose the establishment of product-specific registries to prospectively follow a cohort of patients with rheumatoid arthritis (RA) who receive newly approved therapies. In addition, a database is required of a much larger cohort of RA patients treated with multiple second-line agents, of sufficient size to enable case-controlled determinations of the relative incidence of rare but serious events in the treated (registry) population versus the larger disease population. The number of patients necessary for agent-specific registries, and a larger patient population adequate to supply a matched case-control cohort, will depend upon estimates of the detectability of an increased incidence over background. We suggest a system to carry out this proposal that will involve an umbrella organization responsible for the establishment of this large patient cohort, envisioned to be drawn from around the world.
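The detectability question raised in this abstract reduces to a standard sample-size calculation for comparing a rare adverse-event rate against a background rate. The sketch below is purely illustrative and not taken from the paper; the baseline rate, effect size, significance level, power and the function name n_per_group are all hypothetical choices.

```python
# Illustrative sketch (not from the paper): approximate per-group sample size
# needed to detect a doubling of a rare adverse-event rate over background,
# using the standard normal-approximation formula for two proportions.
# All inputs (baseline rate, effect size, alpha, power) are hypothetical.
from scipy.stats import norm

def n_per_group(p_background, p_treated, alpha=0.05, power=0.80):
    """Approximate per-group sample size for comparing two proportions."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    variance_sum = (p_background * (1 - p_background)
                    + p_treated * (1 - p_treated))
    return (z_alpha + z_beta) ** 2 * variance_sum / (p_background - p_treated) ** 2

# Background rate of 1 serious event per 1,000 patients vs. a hypothesized doubling:
print(round(n_per_group(0.001, 0.002)))  # roughly 23,500 patients per group
```

Numbers of this magnitude illustrate why the proposal pairs product-specific registries with a much larger disease-wide cohort.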
Abstract:
The adaptations of muscle to sprint training can be separated into metabolic and morphological changes. Enzyme adaptations represent a major metabolic adaptation to sprint training, with the enzymes of all three energy systems showing signs of adaptation to training and some evidence of a return to baseline levels with detraining. Myokinase and creatine phosphokinase have shown small increases as a result of short-sprint training in some studies, and elite sprinters appear better able to rapidly break down phosphocreatine (PCr) than the sub-elite. No changes in these enzyme levels have been reported as a result of detraining. Similarly, glycolytic enzyme activity (notably lactate dehydrogenase, phosphofructokinase and glycogen phosphorylase) has been shown to increase after training consisting of either long (> 10-second) or short (< 10-second) sprints. Evidence suggests that these enzymes return to pre-training levels after somewhere between 7 weeks and 6 months of detraining. Mitochondrial enzyme activity also increases after sprint training, particularly when long sprints or short recovery between short sprints are used as the training stimulus. Morphological adaptations to sprint training include changes in muscle fibre type, sarcoplasmic reticulum, and fibre cross-sectional area. An appropriate sprint training programme could be expected to induce a shift toward type IIa muscle, increase muscle cross-sectional area and increase the sarcoplasmic reticulum volume to aid release of Ca2+. Training volume and/or frequency of sprint training in excess of what is optimal for an individual, however, will induce a shift toward slower muscle contractile characteristics. In contrast, detraining appears to shift the contractile characteristics towards type IIb, although muscle atrophy is also likely to occur. Muscle conduction velocity appears to be a potential non-invasive method of monitoring contractile changes in response to sprint training and detraining. In summary, adaptation to sprint training is clearly dependent on the duration of sprinting, recovery between repetitions, total volume and frequency of training bouts. These variables have profound effects on the metabolic, structural and performance adaptations from a sprint-training programme, and these changes take a considerable period of time to return to baseline after a period of detraining. However, the complexity of the interaction between the aforementioned variables and training adaptation, combined with individual differences, clearly hinders the transfer of knowledge and advice from laboratory to coach to athlete.
Abstract:
The monitoring of infection control indicators including hospital-acquired infections is an established part of quality maintenance programmes in many health-care facilities. However, surveillance data use can be frustrated by the infrequent nature of many infections. Traditional methods of analysis often provide delayed identification of increasing infection occurrence, placing patients at preventable risk. The application of Shewhart, Cumulative Sum (CUSUM) and Exponentially Weighted Moving Average (EWMA) statistical process control charts to the monitoring of indicator infections allows continuous real-time assessment. The Shewhart chart will detect large changes, while CUSUM and EWMA methods are more suited to recognition of small to moderate sustained change. When used together, Shewhart and EWMA methods are ideal for monitoring bacteraemia and multiresistant organism rates. Shewhart and CUSUM charts are suitable for surgical infection surveillance.
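As a concrete illustration of the EWMA chart described in this abstract, the following sketch is a minimal, hypothetical example (not taken from the article): it applies the standard EWMA recursion and its time-varying control limit to a made-up series of monthly infection counts, using a normal approximation and illustrative values for the smoothing weight and the control-limit multiplier.

```python
# Minimal sketch of an EWMA control chart for hypothetical monthly infection
# counts. The smoothing weight lam, the limit multiplier L, and the data are
# illustrative; a Poisson-based chart may be preferred for very low counts.
import numpy as np

def ewma_chart(counts, lam=0.2, L=2.7):
    """Return EWMA statistics and upper control limits for a count series."""
    counts = np.asarray(counts, dtype=float)
    mu0 = counts.mean()          # in practice, taken from a historical in-control baseline
    sigma0 = counts.std(ddof=1)  # baseline variability
    z = np.empty_like(counts)
    ucl = np.empty_like(counts)
    prev = mu0
    for t, x in enumerate(counts):
        prev = lam * x + (1 - lam) * prev   # EWMA recursion
        z[t] = prev
        # time-varying upper control limit for the EWMA statistic
        ucl[t] = mu0 + L * sigma0 * np.sqrt(
            lam / (2 - lam) * (1 - (1 - lam) ** (2 * (t + 1))))
    return z, ucl

# Hypothetical monthly bacteraemia counts; a signal is flagged when z exceeds the UCL.
monthly = [2, 3, 1, 2, 4, 2, 3, 5, 6, 7]
z, ucl = ewma_chart(monthly)
print([(round(a, 2), round(b, 2)) for a, b in zip(z, ucl)])
```

A Shewhart chart on the same data would simply flag any single month far outside the baseline mean, which is why the two are complementary: Shewhart for large jumps, EWMA (or CUSUM) for small sustained drifts.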
Abstract:
As with many positive-strand RNA viruses, replication of the hepatitis C virus (HCV) is associated with cytoplasmic membrane rearrangements. However, it is unclear which HCV proteins induce these ultrastructural features. This work examined the morphological changes induced by expression of the HCV structural proteins, core, E1 and E2, expressed from a Semliki Forest virus (SFV) recombinant RNA replicon. Electron microscopy of cells expressing these proteins showed cytoplasmic vacuoles containing membranous and electron-dense material that were distinct from the type I cytoplasmic vacuoles induced during SFV replicon replication. Immunogold labelling showed that the core and E2 proteins localized to the external and internal membranes of these vacuoles and at times were also associated with some of the internal amorphous material. Dual immunogold labelling with antibodies raised against the core protein and against an endoplasmic reticulum (ER)-resident protein (protein disulphide isomerase) showed that the HCV-induced vacuoles were associated with ER-labelled membranes. This report has identified an association of the HCV core and E2 proteins with induced cytoplasmic vacuoles that are morphologically similar to those observed in HCV-infected liver tissue, suggesting that the HCV structural proteins may be responsible for the induction of these vacuoles during HCV replication in vivo.
Abstract:
Shiftwork is a major source of stress for many workers. This study highlights the role that organizational and psychosocial variables play in alleviating the negative health effects of 10- and 14-h shifts. It examines the direct and mediated effects of coping strategies, social support and control of shifts on work/non-work conflict and subjective health. Participants are 60 ambulance workers, aged 22 to 55 years. A structural equation model with good fit demonstrates complex effects of social support from various sources (supervisors, co-workers and family), coping and control on work/non-work conflict and subjective health. Conceptually, the research contributes to the development of a theoretical framework that can assist in explaining how key psychosocial and organizational variables influence the psychological and physical symptoms experienced by shiftworkers. Copyright (C) 2002 John Wiley & Sons, Ltd.
Abstract:
We compare Bayesian methodology utilizing the freeware BUGS (Bayesian Inference Using Gibbs Sampling) with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three-category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues are discussed in using Gibbs sampling, as implemented by BUGS, to fit subject-specific Bayesian generalized linear models, where the components of variation may be estimated directly. The simulation study (based on 2000 twin pairs) indicated that there is a consistent advantage in using the Bayesian method to detect a correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty in detecting the correct model when the additive genetic effect was low (between 10 and 20%) or of moderate range (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%), even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and presented symptoms associated with osteoarthritis occurring in joints of the hand.
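For readers unfamiliar with the ACE terminology in this abstract, the following minimal sketch (our own illustration, not the study's BUGS or Mx code) simulates continuous twin phenotypes under an additive genetic (A), common environment (C), unique environment (E) decomposition and recovers rough Falconer-style estimates from the MZ and DZ correlations. The study itself analysed dichotomous and ordinal data with full model fitting, so all values and names here are illustrative.

```python
# Illustrative ACE sketch (not the study's code): phenotypic variance = A + C + E,
# with expected twin covariances cov(MZ) = A + C and cov(DZ) = 0.5*A + C.
import numpy as np

rng = np.random.default_rng(0)
A, C, E = 0.4, 0.2, 0.4          # illustrative variance components (sum to 1)
n_pairs = 2000                   # matches the simulated sample size in the abstract

def simulate_pairs(genetic_sharing, n):
    """Simulate twin pairs sharing all (MZ, 1.0) or half (DZ, 0.5) of additive genetic variance."""
    shared_a = rng.normal(size=n) * np.sqrt(A * genetic_sharing)
    unique_a = rng.normal(size=(2, n)) * np.sqrt(A * (1 - genetic_sharing))
    shared_c = rng.normal(size=n) * np.sqrt(C)     # common environment, fully shared
    e = rng.normal(size=(2, n)) * np.sqrt(E)       # unique environment, unshared
    return shared_a + unique_a + shared_c + e      # shape (2, n): twin 1 and twin 2

mz = simulate_pairs(1.0, n_pairs)
dz = simulate_pairs(0.5, n_pairs)
r_mz = np.corrcoef(mz)[0, 1]
r_dz = np.corrcoef(dz)[0, 1]

# Falconer-style point estimates, a rough alternative to full BUGS/Mx model fitting:
a2 = 2 * (r_mz - r_dz)           # additive genetic proportion of variance
c2 = 2 * r_dz - r_mz             # common environment proportion of variance
print(f"r_MZ={r_mz:.2f}, r_DZ={r_dz:.2f}, A~{a2:.2f}, C~{c2:.2f}")
```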