137 results for Hilbert-Mumford criterion
Abstract:
Background: Alcohol use has beneficial as well as adverse consequences for health, but few studies have examined its role in the development of age-related frailty. Objectives: To describe the cross-sectional and longitudinal association between alcohol intake and frailty in older persons. Design: The Lausanne cohort 65+ population-based study, launched in 2004. Setting: Community. Participants: One thousand five hundred sixty-four persons aged 65-70 years. Measurements: Annual data collection included demographics, health and functional status, extended by a physical examination every 3 years. Alcohol use (AUDIT-C) and Fried's frailty criteria were measured at baseline and at 3-year follow-up. Participants were categorized as robust (0 frailty criteria) or vulnerable (1+ criteria). Results: Few participants (13.0%) reported no alcohol consumption over the past year; 57.8% were light-to-moderate drinkers, while 29.3% drank above recommended thresholds (18.7% "at risk" and 10.5% "heavy" drinkers). At baseline, vulnerability was most frequent in non-drinkers (43.0%), least frequent in light-to-moderate drinkers (26.2%), and amounted to 31.9% in "heavy" drinkers, showing a reverse J-curve pattern. In multivariate analysis, compared with light-to-moderate drinkers, non-drinkers had twofold higher odds of prevalent vulnerability (adjOR: 2.24; 95%CI: 1.39-3.59; p=.001) as well as of 3-year incident vulnerability (adjOR: 2.00; 95%CI: 1.02-3.91; p=.043). No significant association was observed among "at risk" and "heavy" drinkers. Conclusion: Non-drinkers had twofold higher odds of prevalent and 3-year incident vulnerability, even after adjustment for their poorer baseline health status. Although residual confounding remains possible, these results likely reflect a healthy-survivor effect among drinkers, while those who experienced health- or alcohol-related problems stopped drinking earlier.
Abstract:
The screening of testosterone (T) misuse for doping control is based on the urinary steroid profile, including T, its precursors and its metabolites. Modifications of individual levels and of the ratios between those metabolites are indicators of T misuse. In the context of screening analysis, the most discriminant criterion known to date is the ratio of T glucuronide (TG) to epitestosterone glucuronide (EG) (TG/EG). Following the World Anti-Doping Agency (WADA) recommendations, T misuse is suspected when the ratio reaches or exceeds 4. While this marker remains very sensitive and specific, it suffers from large inter-individual variability, strongly influenced by enzyme polymorphisms. Moreover, low-dose or topical administration forms make the screening of endogenous steroids difficult, as the detection window no longer matches actual doping practices. As reference limits are estimated from population studies, which encompass inter-individual and inter-ethnic variability, new strategies including individual threshold monitoring and alternative biomarkers have been proposed to detect T misuse. The purpose of this study was to evaluate the potential of ultra-high-pressure liquid chromatography (UHPLC) coupled with a new-generation high-resolution quadrupole time-of-flight mass spectrometer (QTOF-MS) to investigate steroid metabolism after transdermal and oral T administration. An approach was developed to quantify 12 targeted urinary steroids as direct glucuro- and sulfo-conjugated metabolites, preserving the phase II metabolism information that reflects genetic and environmental influences. The UHPLC-QTOF-MS(E) platform was applied to clinical study samples from 19 healthy male volunteers with different genotypes for the UGT2B17 enzyme responsible for the glucuroconjugation of T.
Based on reference population ranges, none of the traditional markers of T misuse could detect doping after topical administration of T, while the detection window was short after oral T undecanoate (TU) ingestion. The detection ability of the 12 targeted steroids was therefore evaluated using individual thresholds following both transdermal and oral administration. Other relevant biomarkers and minor metabolites were studied for information complementary to the steroid profile, including sulfoconjugated analytes and hydroxy forms of glucuroconjugated metabolites. While sulfoconjugated steroids may provide helpful screening information for individuals with homozygous UGT2B17 deletion, hydroxy-glucuroconjugated analytes could extend the detection window for oral TU doping.
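The TG/EG screening rule above is simple enough to state as code. A minimal sketch based only on the abstract; the function name, argument conventions and return shape are illustrative, not taken from WADA's technical documents:

```python
def tg_eg_suspicious(tg_conc, eg_conc, threshold=4.0):
    """Flag a urinary steroid profile for follow-up when the ratio of
    testosterone glucuronide (TG) to epitestosterone glucuronide (EG)
    reaches the WADA threshold of 4 described above.

    Returns (ratio, suspicious): a ratio at or beyond the threshold
    raises suspicion of testosterone misuse under this screening rule.
    """
    if eg_conc <= 0:
        raise ValueError("EG concentration must be positive to form the ratio")
    ratio = tg_conc / eg_conc
    return ratio, ratio >= threshold
```

As the abstract notes, this population-based cutoff is sensitive but suffers from inter-individual variability (e.g. UGT2B17 polymorphisms), which is why individual threshold monitoring is proposed as a complement.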
Abstract:
OBJECT: Reversible cerebral vasoconstriction syndrome (RCVS) is a clinical and radiological entity characterized by thunderclap headaches and a reversible segmental or multifocal vasoconstriction of the cerebral arteries, with or without focal neurological deficits or seizures. The purpose of this study was to determine risk factors for poor outcome in patients presenting with RCVS. METHODS: A retrospective multi-center review of invasive and non-invasive neurovascular imaging between January 2006 and January 2011 identified 10 patients meeting the criteria for RCVS. Demographic data, vascular risk factors and the evolution of each of these patients were analyzed. RESULTS: Seven of the ten patients were female, with a mean age of 46 years. In four patients, we did not find any causative factor. Two cases presented RCVS in the post-partum period, between the first and third week after delivery. The remaining four cases were drug-induced RCVS, mainly by vasoactive drugs: cannabis was the causative factor in two patients, sumatriptan in one, and cyclosporine in one. The mean duration of clinical follow-up was 10.2 months (range: 0-28 months). Two patients had neurological sequelae: one retained dysphasia and the other a homonymous lateral hemianopia. We could not find any significant difference in evolution between secondary and idiopathic RCVS. The only two factors that correlated with clinical outcome were the neurological status at admission and the presence of intraparenchymal abnormalities (ischemic stroke, hematoma) on brain imaging. CONCLUSIONS: Fulminant vasoconstriction resulting in progressive symptoms or death has been reported only exceptionally.
Physicians should keep in mind that such an evolution can occur and anticipate it by identifying all factors of poor prognosis (neurological status at admission and the presence of intraparenchymal abnormalities).
Abstract:
Objective: To assess the effectiveness of obesity markers in detecting a high (>5%) 10-year risk of fatal cardiovascular disease (CVD) as estimated using the SCORE function. Methods: Cross-sectional study including 3,047 women and 2,689 men aged 35-75 years (CoLaus study). Body fat percentage was assessed by tetrapolar bioimpedance. CVD risk was assessed using the SCORE risk function, and gender- and age-specific cut points for body fat were derived. The diagnostic accuracy of each obesity marker was evaluated through receiver operating characteristic (ROC) analysis. Results: Body fat showed a higher correlation with 10-year CVD risk than waist/hip ratio (WHR), waist circumference or BMI: in men, r=0.31, 0.22, 0.19 and 0.12 for body fat, WHR, waist and BMI, respectively; the corresponding values in women were 0.18, 0.15, 0.11 and 0.05 (all p<0.05). In both genders, body fat showed the highest area under the ROC curve (AUC): in men, the AUCs (and 95% confidence intervals) were 76.0 (73.8-78.2), 67.3 (64.6-69.9), 65.8 (63.1-68.5) and 60.6 (57.9-63.5) for body fat, WHR, waist and BMI, respectively. In women, the corresponding values were 72.3 (69.2-75.3), 66.6 (63.1-70.2), 64.1 (60.6-67.6) and 58.8 (55.2-62.4). The body fat percentage criterion captured three times more subjects with high CVD risk than the BMI criterion, and almost twice as many as the WHR criterion. Conclusions: Obesity defined by body fat percentage is more accurate in detecting a high 10-year risk of fatal CVD than obesity markers based on WHR, waist circumference or BMI.
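The AUC comparison above ranks each obesity marker by how well its values separate high-risk from lower-risk subjects: the AUC equals the probability that a randomly chosen high-risk subject scores higher on the marker than a randomly chosen lower-risk subject (the Mann-Whitney interpretation). A minimal, illustrative computation; the example values are invented and are not study data:

```python
def auc_from_scores(scores_pos, scores_neg):
    """AUC as the fraction of (high-risk, lower-risk) pairs in which the
    high-risk subject has the larger marker value; ties count one half.
    Equivalent to the normalized Mann-Whitney U statistic."""
    pairs = len(scores_pos) * len(scores_neg)
    if pairs == 0:
        raise ValueError("both groups must be non-empty")
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / pairs
```

With a marker such as body fat percentage, `scores_pos` would hold values for subjects above the 5% SCORE threshold and `scores_neg` the rest; an AUC of 0.5 means no discrimination, 1.0 perfect separation.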
Abstract:
Introduction: THC-COOH has been proposed as a criterion to help distinguish occasional from regular cannabis users. However, to date this indicator has not been adequately assessed under experimental and real-life conditions. Methods: We carried out a placebo-controlled administration study of smoked cannabis. Twenty-three heavy smokers and 25 occasional smokers, between 18 and 30 years of age, participated in this study [Battistella G et al., PLoS ONE. 2013;8(1):e52545]. We also collected data from a second, real-case study of 146 traffic offenders for whom the whole-blood cannabinoid concentrations and the frequency of cannabis use were known. Cannabinoid levels were determined in whole blood using tandem mass spectrometry. Results: Large, significant differences in THC-COOH concentrations were found between the two groups when measured during the screening visit, prior to the smoking session, and throughout the day of the experiment. Receiver operating characteristic (ROC) curves were determined, and two threshold criteria were proposed to distinguish between these groups: a free THC-COOH concentration below 3 μg/L suggests occasional consumption (≤ 1 joint/week), while a concentration above 40 μg/L corresponds to heavy use (≥ 10 joints/month). These thresholds were successfully tested against the second, real-case study. They were not challenged by the presence of ethanol (40% of cases) or of other therapeutic and illegal drugs (24%), and they were also consistent with previously published experimental data. Conclusion: We propose the following procedure, which can be very useful in the Swiss context but also in other countries with similar traffic policies: if the whole-blood THC-COOH concentration is higher than 40 μg/L, traffic offenders should be directed first and foremost toward a medical assessment of their fitness to drive.
This evaluation is not recommended if the THC-COOH concentration is lower than 3 μg/L. A THC-COOH level between these two thresholds cannot be reliably interpreted; in such a case, further medical assessment and follow-up of fitness to drive are also suggested, but with lower priority.
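The proposed triage procedure maps a single measured concentration onto three recommendations and can be sketched directly from the thresholds above (the function name and the wording of the returned labels are illustrative):

```python
def triage_thc_cooh(conc_ug_per_l):
    """Triage a traffic offender by whole-blood free THC-COOH (ug/L),
    following the two ROC-derived thresholds proposed above:
    > 40 ug/L suggests heavy use, < 3 ug/L suggests occasional use,
    and the interval in between cannot be reliably interpreted."""
    if conc_ug_per_l > 40:
        return "refer for medical assessment of fitness to drive"
    if conc_ug_per_l < 3:
        return "assessment not recommended (suggests occasional use)"
    return "indeterminate: medical assessment and follow-up, lower priority"
```

Note the asymmetry of the rule: only the upper threshold triggers a mandatory referral, while the intermediate zone still leads to follow-up, just with lower priority.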
Abstract:
Bacillus subtilis is the best-characterized member of the Gram-positive bacteria. Its genome of 4,214,810 base pairs comprises 4,100 protein-coding genes. Of these protein-coding genes, 53% are represented once, while a quarter of the genome corresponds to several gene families that have been greatly expanded by gene duplication, the largest family containing 77 putative ATP-binding transport proteins. In addition, a large proportion of the genetic capacity is devoted to the utilization of a variety of carbon sources, including many plant-derived molecules. The identification of five signal peptidase genes, as well as several genes for components of the secretion apparatus, is important given the capacity of Bacillus strains to secrete large amounts of industrially important enzymes. Many of the genes are involved in the synthesis of secondary metabolites, including antibiotics, that are more typically associated with Streptomyces species. The genome contains at least ten prophages or remnants of prophages, indicating that bacteriophage infection has played an important evolutionary role in horizontal gene transfer, in particular in the propagation of bacterial pathogenesis.
Abstract:
OBJECTIVES: To document the prevalence of asynchrony events during noninvasive ventilation in pressure support in infants and children, and to compare the results with neurally adjusted ventilatory assist. DESIGN: Prospective randomized cross-over study in children undergoing noninvasive ventilation. SETTING: The study was performed in a PICU. PATIENTS: Infants and children aged 4 weeks to 5 years. INTERVENTIONS: Two consecutive ventilation periods (pressure support and neurally adjusted ventilatory assist) were applied in random order. During pressure support (PS), three levels of the expiratory trigger setting (ETS) were compared: the initial ETS (PSinit), and the ETS value decreased and increased by 15%. Of the three sessions, the period with the lowest number of asynchrony events was defined as PSbest. The neurally adjusted ventilatory assist level was adjusted to match the maximum airway pressure during PSinit. Positive end-expiratory pressure was the same during pressure support and neurally adjusted ventilatory assist. Asynchrony events, trigger delay, and cycling-off delay were quantified for each period. RESULTS: Six infants and children were studied. Trigger delay was lower with neurally adjusted ventilatory assist versus PSinit and PSbest (61 ms [56-79] vs 149 ms [134-180] and 146 ms [101-162]; p = 0.001 and 0.02, respectively). Inspiratory time in excess showed a trend toward being shorter during pressure support versus neurally adjusted ventilatory assist. The main asynchrony events during PSinit were autotriggering (4.8/min [1.7-12]), ineffective efforts (9.9/min [1.7-18]), and premature cycling (6.3/min [3.2-18.7]). Premature cycling (3.4/min [1.1-7.7]) was less frequent during PSbest versus PSinit (p = 0.059). The asynchrony index was significantly lower during PSbest versus PSinit (40% [28-65] vs 65.5% [42-76], p < 0.001). With neurally adjusted ventilatory assist, all types of asynchrony except double triggering were reduced.
The asynchrony index was lower with neurally adjusted ventilatory assist (2.3% [0.7-5] vs PSinit and PSbest, p < 0.05 for both comparisons). CONCLUSION: Asynchrony events are frequent during noninvasive ventilation with pressure support in infants and children despite adjustment of the cycling-off criterion. Compared with pressure support, neurally adjusted ventilatory assist improves patient-ventilator synchrony by reducing trigger delay and the number of asynchrony events. Further studies should determine the clinical impact of these findings.
Abstract:
BACKGROUND: The Spiritual Distress Assessment Tool (SDAT) is a 5-item instrument developed to assess unmet spiritual needs in hospitalized elderly patients and to determine the presence of spiritual distress. The objective of this study was to investigate the psychometric properties of the SDAT. METHODS: This cross-sectional study was performed in a geriatric rehabilitation unit. Patients (N = 203), aged 65 years and over with a Mini Mental State Exam score ≥ 20, were consecutively enrolled over a 6-month period. Data on health, functional, cognitive, affective and spiritual status were collected upon admission. Interviews using the SDAT (scored from 0 to 15, higher scores indicating higher distress) were conducted by a trained chaplain. Factor analysis, measures of internal consistency (inter-item and item-to-total correlations, Cronbach's α), and reliability (intra-rater and inter-rater) were performed. Criterion-related validity was assessed using the Functional Assessment of Chronic Illness Therapy-Spiritual Well-Being (FACIT-Sp) and the question "Are you at peace?" as criterion standards. Concurrent and predictive validity were assessed using the Geriatric Depression Scale (GDS), the occurrence of a family meeting, hospital length of stay (LOS) and destination at discharge. RESULTS: SDAT scores ranged from 1 to 11 (mean 5.6 ± 2.4). Overall, 65.0% (132/203) of the patients reported some spiritual distress on the SDAT total score and 22.2% (45/203) reported at least one severe unmet spiritual need. A two-factor solution explained 60% of the variance. Inter-item correlations ranged from 0.11 to 0.41 (eight out of ten with P < 0.05). Item-to-total correlations ranged from 0.57 to 0.66 (all P < 0.001). Cronbach's α was acceptable (0.60). Intra-rater and inter-rater reliabilities were high (intraclass correlation coefficients ranging from 0.87 to 0.96).
The SDAT correlated significantly with the FACIT-Sp, "Are you at peace?", and the GDS (Rho -0.45, -0.33, and 0.43, respectively, all P < .001), and with LOS (Rho 0.15, P = .03). Compared with patients showing no severe unmet spiritual need, patients with at least one severe unmet spiritual need had higher odds of occurrence of a family meeting (adjOR 4.7, 95%CI 1.4-16.3, P = .02) and were more often discharged to a nursing home (13.3% vs 3.8%; P = .027). CONCLUSIONS: The SDAT has acceptable psychometric properties and appears to be a valid and reliable instrument to assess spiritual distress in elderly hospitalized patients.
Abstract:
OBJECTIVE: The aim of this study was to evaluate the impact of communication skills training (CST) on working alliance and to identify specific communicational elements related to working alliance. METHODS: Pre- and post-training simulated patient interviews (6-month interval) of oncology physicians and nurses (N=56) who benefited from CST were compared to two simulated patient interviews with a 6-month interval of oncology physicians and nurses (N=57) who did not benefit from CST. The patient-clinician interaction was analyzed by means of the Roter Interaction Analysis System (RIAS). Alliance was measured by the Working Alliance Inventory - Short Revised Form. RESULTS: While working alliance did not improve with CST, generalized linear mixed effect models demonstrated that the quality of verbal communication was related to alliance. Positive talk and psychosocial counseling fostered alliance whereas negative talk, biomedical information and patient's questions diminished alliance. CONCLUSION: Patient-clinician alliance is related to specific verbal communication behaviors. PRACTICE IMPLICATIONS: Working alliance is a key element of patient-physician communication which deserves further investigation as a new marker and efficacy criterion of CST outcome.
Abstract:
Recent laboratory studies have suggested that heart rate variability (HRV) may be an appropriate criterion for training load (TL) quantification. The aim of this study was to validate a novel HRV index that may be used to assess TL in field conditions. Eleven well-trained male long-distance runners performed four exercises of different duration and intensity. TL was evaluated using the Foster and Banister methods. In addition, HRV measurements were performed 5 minutes before exercise and 5 and 30 minutes after exercise. We calculated an HRV-based TL index (TLHRV) from the ratio between the HRV decrease during exercise and the HRV increase during recovery. The HRV decrease during exercise was strongly correlated with exercise intensity (R = -0.70; p < 0.01) but not with exercise duration or training volume. The TLHRV index was correlated with the Foster (R = 0.61; p = 0.01) and Banister (R = 0.57; p = 0.01) methods. This study confirms that HRV changes during exercise and the recovery phase are affected by both the intensity and the physiological impact of the exercise. Since the TLHRV formula takes into account both the disturbance and the return to homeostatic balance induced by exercise, this new method provides an objective and rational TL index. However, some simplification of the measurement protocol could be envisaged for field use.
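The abstract describes TLHRV as based on the ratio between the HRV decrease during exercise and the HRV increase during recovery, but does not reproduce the published formula. The sketch below is therefore only a hypothetical illustration of such a ratio from the three stated measurement points (5 min pre-exercise, 5 and 30 min post-exercise); the function name and the exact arithmetic are assumptions, not the authors' formula:

```python
def tl_hrv(hrv_pre, hrv_post5, hrv_post30):
    """Illustrative training-load index from three HRV measurements:
    5 min before exercise, and 5 and 30 min after exercise.

    decrease = HRV lost during exercise (disturbance of homeostasis)
    recovery = HRV regained between 5 and 30 min post-exercise
    A larger ratio means a bigger disturbance relative to the speed of
    the return toward homeostatic balance. Hypothetical sketch only.
    """
    decrease = hrv_pre - hrv_post5
    recovery = hrv_post30 - hrv_post5
    if recovery <= 0:
        raise ValueError("no measured recovery; index undefined in this sketch")
    return decrease / recovery
```

For example, a run that drops an HRV measure from 100 to 40 ms, with recovery to 70 ms at 30 minutes, yields a ratio of 2.0; a lighter session with the same recovery would yield a smaller value.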
Abstract:
OBJECTIVES: The aim of the study was to statistically model the relative increase in risk of cardiovascular disease (CVD) per year older in the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study and to compare this with the relative increase in risk of CVD per year older in general-population risk equations. METHODS: We analysed three endpoints: myocardial infarction (MI), coronary heart disease (CHD: MI or invasive coronary procedure) and CVD (CHD or stroke). We fitted a number of parametric age effects, adjusting for known risk factors and antiretroviral therapy (ART) use. The best-fitting age effect was determined using the Akaike information criterion. We compared the ageing effect from D:A:D with that from the general-population risk equations: the Framingham Heart Study, CUORE and ASSIGN risk scores. RESULTS: A total of 24,323 men were included in the analyses. Crude MI, CHD and CVD event rates per 1000 person-years increased from 2.29, 3.11 and 3.65 in those aged 40-45 years to 6.53, 11.91 and 15.89 in those aged 60-65 years, respectively. The best-fitting models included inverse age for MI and age + age² for CHD and CVD. In D:A:D there was a slowly accelerating increase in the risk of CHD and CVD per year older, which appeared to be only modest yet was consistently raised compared with the risk in the general population. The relative risk of MI with age did not differ between D:A:D and the general population. CONCLUSIONS: We found only limited evidence of an accelerating increase in the risk of CVD with age in D:A:D compared with the general population. The absolute risk of CVD associated with HIV infection remains uncertain.
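Selecting the best-fitting age effect by the Akaike information criterion amounts to computing AIC = 2k - 2 ln L for each candidate parameterization (k parameters, maximized log-likelihood ln L) and keeping the minimum. A minimal sketch of that selection step; the candidate names mirror the age effects mentioned above, but the likelihood values are invented for illustration:

```python
def aic(log_likelihood, k):
    """Akaike information criterion: AIC = 2k - 2 ln L.
    Lower is better; the 2k term penalizes extra parameters."""
    return 2 * k - 2 * log_likelihood

def best_model(candidates):
    """Return the name of the candidate (name, log_likelihood, k)
    with the lowest AIC, as in the model selection described above."""
    return min(candidates, key=lambda c: aic(c[1], c[2]))[0]
```

A hypothetical comparison: with log-likelihoods -100.0 (3 parameters) for an inverse-age effect and -99.5 (4 parameters) for age + age², the AICs are 206 and 207, so the extra parameter does not pay for its small likelihood gain.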
Abstract:
BACKGROUND: Frailty is detected by weight loss, weakness, slow walking velocity, reduced physical activity or poor endurance/exhaustion. Handwriting has not been examined in the context of frailty, despite its functional importance. OBJECTIVE: Our goal was to examine quantitative handwriting measures in people meeting 0, 1, and 2 or more (2+) frailty criteria. We also examined whether handwriting parameters were associated with gait performance, weakness, poor endurance/exhaustion and cognitive impairment. METHODS: From the population-based Lc65+ cohort, 72 subjects meeting 2+ frailty criteria with complete handwriting samples were identified, along with gender-matched controls meeting 1 criterion or no criteria. Cognitive impairment was defined by a Mini-Mental State Examination score of 25 or less or by the lowest 20th percentile of Trail Making Test Part B. Handwriting was recorded using a writing tablet, and measures of velocity, pauses, and pressure were extracted. RESULTS: Subjects with 2+ criteria were older and had more health problems and a greater need for assistance, but had higher education. No handwriting parameter differed between frailty groups (adjusted for age and education). Writing velocity was not significantly slower among participants in the slowest 20th percentile of gait velocity, but writing pressure was significantly lower among those in the lowest 20th percentile of grip strength. Poor endurance/exhaustion was not associated with handwriting measures. Low cognitive performance was related to longer pauses. CONCLUSIONS: Handwriting parameters might be associated with specific aspects of the frailty phenotype, but not reliably with global definitions of frailty at its earliest stages among subjects able to perform handwriting tests.
Abstract:
Within Data Envelopment Analysis, several alternative models allow for an environmental adjustment, and the majority of them deliver divergent results. Decision makers therefore face the difficult task of selecting the most suitable model. This study addresses this difficulty and thereby fills a research gap. First, a two-step web-based survey is conducted, which aims (1) to identify the selection criteria, (2) to prioritize and weight the selection criteria with respect to the goal of selecting the most suitable model, and (3) to collect preferences about which model best fulfils each selection criterion. Second, the Analytic Hierarchy Process is used to quantify the preferences expressed in the survey. Results show that the understandability, the applicability and the acceptability of the alternative models are valid selection criteria. The selection of the most suitable model depends on the preferences of the decision makers with regard to these criteria.
Abstract:
Maximum entropy modeling (Maxent) is a widely used algorithm for predicting species distributions across space and time. Properly assessing the uncertainty in such predictions is non-trivial and requires validation with independent datasets. Notably, model complexity (the number of model parameters) remains a major concern in relation to overfitting and, hence, the transferability of Maxent models. An emerging approach is to validate the cross-temporal transferability of model predictions using paleoecological data. In this study, we assess the effect of model complexity on the performance of Maxent projections across time using two European plant species (Alnus glutinosa (L.) Gaertn. and Corylus avellana L.) with an extensive late Quaternary fossil record in Spain as a study case. We fitted 110 models with different levels of complexity under present-day conditions and tested model performance using the AUC (area under the receiver operating characteristic curve) and AICc (corrected Akaike information criterion) through the standard procedure of randomly partitioning current occurrence data. We then compared these results with an independent validation obtained by projecting the models onto mid-Holocene (6,000 years before present) climatic conditions in Spain to assess their ability to predict fossil pollen presence-absence and abundance. We find that calibrating Maxent models with default settings results in the generation of overly complex models. While model performance increased with model complexity when predicting current distributions, it was highest at intermediate complexity when predicting mid-Holocene distributions. Hence, models of intermediate complexity offered the best trade-off for predicting species distributions across time. Reliable temporal model transferability is especially relevant for forecasting species distributions under future climate change.
Consequently, species-specific model tuning should be used to find the best modeling settings to control for complexity, notably with paleoecological data to independently validate model projections. For cross-temporal projections of species distributions for which paleoecological data are not available, models of intermediate complexity should be selected.
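The AICc used above to penalize Maxent model complexity is the standard small-sample correction of AIC. The abstract does not spell the formula out, but the usual definition is AICc = AIC + 2k(k+1)/(n - k - 1), with AIC = 2k - 2 ln L, k parameters and n occurrence records; a short sketch:

```python
def aicc(log_likelihood, k, n):
    """Corrected Akaike information criterion:
    AICc = 2k - 2 ln L + 2k(k+1)/(n - k - 1).
    The correction term grows as the parameter count k approaches the
    sample size n, penalizing complex models fitted to few records,
    which is exactly the overfitting concern raised above."""
    if n - k - 1 <= 0:
        raise ValueError("AICc undefined when n <= k + 1")
    aic = 2 * k - 2 * log_likelihood
    return aic + (2 * k * (k + 1)) / (n - k - 1)
```

For large n the correction vanishes and AICc converges to AIC, which is why the penalty matters most for species with sparse occurrence data.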