137 results for Time Use


Relevance: 30.00%

Abstract:

Point-of-care (POC) tests offer potentially substantial benefits for the management of infectious diseases, mainly by shortening the time to result and by making the test available at the bedside or at remote care centres. Commercial POC tests are already widely available for the diagnosis of bacterial and viral infections and for parasitic diseases, including malaria. Infectious diseases specialists and clinical microbiologists should be aware of the indications and limitations of each rapid test, so that they can use them appropriately and correctly interpret their results. The clinical applications and performance of the most relevant and commonly used POC tests are reviewed. Some of these tests exhibit insufficient sensitivity, and should therefore be coupled to confirmatory tests when the results are negative (e.g. Streptococcus pyogenes rapid antigen detection test), whereas the results of others need to be confirmed when positive (e.g. malaria). New molecular-based tests exhibit better sensitivity and specificity than former immunochromatographic assays (e.g. Streptococcus agalactiae detection). In the coming years, further evolution of POC tests may lead to new diagnostic approaches, such as panel testing, targeting not just a single pathogen, but all possible agents suspected in a specific clinical setting. To reach this goal, the development of serology-based and/or molecular-based microarrays/multiplexed tests will be needed. The availability of modern technology and new microfluidic devices will provide clinical microbiologists with the opportunity to be back at the bedside, proposing a large variety of POC tests that will allow quicker diagnosis and improved patient care.

Relevance: 30.00%

Abstract:

Background/Aims: Cognitive dysfunction after medical treatment is increasingly being recognized. Studies on this topic require repeated cognitive testing within a short time. However, with repeated testing, practice effects must be expected. We quantified practice effects in a demographically corrected summary score of a neuropsychological test battery repeatedly administered to healthy elderly volunteers. Methods: The Consortium to Establish a Registry for Alzheimer's Disease (CERAD) Neuropsychological Assessment Battery (for which a demographically corrected summary score was developed), phonemic fluency tests, and trail-making tests were administered to healthy volunteers aged 65 years or older on days 0, 7, and 90. This battery allows calculation of a demographically adjusted continuous summary score. Results: Significant practice effects were observed in the CERAD total score and in the word list (learning and recall) subtest. Based on these volunteer data, we developed a threshold for the diagnosis of postoperative cognitive dysfunction (POCD) with the CERAD total score. Conclusion: Practice effects with repeated administration of neuropsychological tests must be accounted for in the interpretation of such tests. Ignoring practice effects may lead to an underestimation of POCD. The usefulness of the proposed demographically adjusted continuous score for cognitive function will have to be tested prospectively in patients.
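The abstract does not spell out how the POCD threshold was derived from the volunteer data; one common approach is a practice-adjusted reliable change index, sketched below with made-up numbers (not the study's values, and not necessarily its exact procedure).

```python
# Illustrative only: a reliable-change-style threshold that corrects for
# practice effects (not necessarily the procedure used by the authors).
import numpy as np

def rci_practice_adjusted(baseline, retest, mean_practice, sd_diff):
    """Reliable change index with practice-effect correction.

    baseline, retest: summary scores at day 0 and at retest
    mean_practice:    mean score gain seen in healthy volunteers on retest
    sd_diff:          SD of (retest - baseline) differences in volunteers
    """
    return (retest - baseline - mean_practice) / sd_diff

# Hypothetical volunteer retest gains (NOT the study's data)
diffs = np.array([2.0, 1.5, 3.0, 0.5, 2.5])
mean_practice, sd_diff = diffs.mean(), diffs.std(ddof=1)

# A patient whose practice-adjusted change falls below -1.96 would be flagged
z = rci_practice_adjusted(baseline=100.0, retest=97.0,
                          mean_practice=mean_practice, sd_diff=sd_diff)
print(round(z, 2), "-> POCD if z < -1.96")
```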

Relevance: 30.00%

Abstract:

BACKGROUND: Aging of the population in all western countries will challenge Emergency Departments (ED), as older patients visit these health services more frequently and present with special needs. The aim of this study was to describe the trend in ED visits by patients aged 85 years and over between 2005 and 2010, to compare their service use with that of patients aged 65-84 years during this period, and to investigate how these comparisons evolved over time. METHODS: The data comprised all ED visits to the University of Lausanne Medical Center (CHUV), a tertiary Swiss teaching hospital, between 2005 and 2010 by patients aged 65 years and over (65+ years). ED visit characteristics were described according to age group and year. Incidence rates of ED visits and length of ED stay were calculated. RESULTS: Between 2005 and 2010, ED visits by patients aged 65 years and over increased by 26% overall, and by 46% among those aged 85 years and over (85+ years). The estimated ED visit incidence rate for persons aged 85+ years was twice that for persons aged 65-84 years. Compared with patients aged 65-84 years, those aged 85+ years were more likely to be hospitalized and had longer ED stays; this latter difference increased between 2005 and 2010. CONCLUSIONS: Oldest-old patients are increasingly using ED services. These services need to adapt their care delivery processes to meet the needs of a rising number of these complex, multimorbid and vulnerable patients.
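As an illustration of the incidence-rate comparison described above, the following minimal sketch uses hypothetical visit counts and population denominators (not the CHUV data) to show how age-group-specific rates would be computed and compared.

```python
# Hypothetical illustration of age-group-specific ED visit incidence rates;
# the counts and population denominators below are made up, not the study's.
visits = {"65-84": 9000, "85+": 3000}          # ED visits in a given year
population = {"65-84": 60000, "85+": 10000}    # catchment population at risk

for group in visits:
    rate = visits[group] / population[group] * 1000   # per 1,000 persons per year
    print(f"{group}: {rate:.0f} visits per 1,000 person-years")

# With these made-up numbers the 85+ rate (300/1,000) is twice the 65-84 rate
# (150/1,000), mirroring the two-fold difference reported in the abstract.
```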

Relevance: 30.00%

Abstract:

Biological invasions and land-use changes are two major causes of global modifications of biodiversity. Habitat suitability models are the tools of choice for predicting the potential distributions of invasive species. Although land use is a key driver of alien species invasions, it is often assumed to be constant in time. Here we combine historical and present-day information to evaluate whether land-use changes could explain the dynamics of invasion of the American bullfrog Rana catesbeiana (= Lithobates catesbeianus) in Northern Italy, from the 1950s to the present day. We used MaxEnt to build habitat suitability models on the basis of past (1960s, 1980s) and present-day data on land use and species distribution. For example, we used models built with the 1960s data to predict the distribution in the 1980s, and so on. Furthermore, we used land-use scenarios to project suitability into the future. The habitat suitability models predicted the spread of bullfrogs in the subsequent temporal step well. Models considering land-use changes predicted invasion dynamics better than models assuming constant land use over the last 50 years. Scenarios of future land use suggest that suitability will remain similar in the coming years. Habitat suitability models can help us understand and predict the dynamics of invasions; however, land use is not constant in time: land-use modifications can strongly affect invasions, and both land management and the suitability of a given land-use class may vary over time. Integrating land-use changes into studies of biological invasions can help to improve management strategies.
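As a rough illustration of the calibrate-on-one-decade, validate-on-the-next idea, the sketch below uses a plain logistic regression on simulated data as a stand-in for the authors' MaxEnt workflow; the covariates and values are hypothetical.

```python
# Simplified proxy for the temporal-validation idea (logistic regression instead
# of MaxEnt; covariates and presence data are simulated, not the study's).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical 1960s calibration data: land-use covariates per grid cell
X_1960 = rng.normal(size=(500, 3))        # e.g. % wetland, % urban, % cropland
y_1960 = (X_1960[:, 0] + rng.normal(scale=0.5, size=500) > 0).astype(int)

# Hypothetical 1980s data, used only for validation
X_1980 = rng.normal(size=(500, 3))
y_1980 = (X_1980[:, 0] + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = LogisticRegression().fit(X_1960, y_1960)        # calibrate on the past
suitability_1980 = model.predict_proba(X_1980)[:, 1]    # project one step forward
print("AUC on the later period:", round(roc_auc_score(y_1980, suitability_1980), 2))
```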

Relevance: 30.00%

Abstract:

OBJECTIVES: The use of tenofovir is highly associated with the emergence of mutation K65R, which confers broad resistance to nucleoside/nucleotide analogue reverse transcriptase inhibitors (NRTIs), especially when tenofovir is combined with other NRTIs that also select for K65R. Although recent HIV-1 treatment guidelines discouraging these combinations have resulted in reduced K65R selection with tenofovir, updated information on the impact of currently recommended regimens on the population selection rate of K65R is lacking. METHODS: We evaluated changes over time in the selection rate of the resistance mutation K65R in a large population of 2736 HIV-1-infected patients failing combination antiretroviral treatment between 2002 and 2010. RESULTS: The K65R resistance mutation was detected in 144 patients, a prevalence of 5.3%. A large majority of observed K65R cases were explained by the use of tenofovir, reflecting its wide use in clinical practice. However, changing patterns over time in the NRTIs accompanying tenofovir resulted in a steadily decreasing probability of K65R selection by tenofovir-based therapy. The currently recommended NRTI combination tenofovir/emtricitabine was associated with a low probability of K65R emergence. For any given dual NRTI combination including tenofovir, higher selection rates of K65R were consistently observed with a non-nucleoside reverse transcriptase inhibitor than with a protease inhibitor as the third agent. DISCUSSION: Our finding of a stable time trend in K65R despite the widespread use of tenofovir illustrates the increased potency of current tenofovir-containing HIV-1 therapy.
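The reported prevalence follows directly from the counts given in the abstract:

```python
# Prevalence of K65R among patients failing therapy, from the counts in the abstract
k65r_cases, patients = 144, 2736
prevalence = k65r_cases / patients * 100
print(f"{prevalence:.1f}%")   # -> 5.3%
```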

Relevance: 30.00%

Abstract:

BACKGROUND: Increasing the appropriateness of use of upper gastrointestinal (GI) endoscopy is important to improve quality of care while at the same time containing costs. This study explored whether detailed explicit appropriateness criteria significantly improve the diagnostic yield of upper GI endoscopy. METHODS: Consecutive patients referred for upper GI endoscopy at 6 centers (1 university hospital, 2 district hospitals, 3 gastroenterology practices) were prospectively included over a 6-month period. After controlling for disease presentation and patient characteristics, the relationship between the appropriateness of upper GI endoscopy, as assessed by explicit Swiss criteria developed by the RAND/UCLA panel method, and the presence of relevant endoscopic lesions was analyzed. RESULTS: A total of 2088 patients (60% outpatients, 57% men) were included. Analysis was restricted to the 1681 patients referred for diagnostic upper GI endoscopy. Forty-six percent of upper GI endoscopies were judged to be appropriate, 15% uncertain, and 39% inappropriate by the explicit criteria. No cancer was found in upper GI endoscopies judged to be inappropriate. Upper GI endoscopies judged appropriate or uncertain yielded significantly more relevant lesions (60%) than did those judged to be inappropriate (37%; odds ratio 2.6, 95% CI 2.2-3.2). In multivariate analyses, the diagnostic yield of upper GI endoscopy was significantly influenced by appropriateness, patient gender and age, treatment setting, and symptoms. CONCLUSIONS: Upper GI endoscopies performed for appropriate indications detected significantly more clinically relevant lesions than did those performed for inappropriate indications. In addition, no upper GI endoscopy that resulted in a diagnosis of cancer was judged to be inappropriate. The use of such criteria improves patient selection for upper GI endoscopy and can thus contribute to efforts aimed at enhancing the quality and efficiency of care. (Gastrointest Endosc 2000;52:333-41).
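The reported odds ratio can be checked from the two diagnostic-yield proportions (the confidence interval would additionally require the underlying counts, which the abstract does not give):

```python
# Point estimate of the odds ratio from the yields reported in the abstract
p_appropriate, p_inappropriate = 0.60, 0.37
odds_ratio = (p_appropriate / (1 - p_appropriate)) / (p_inappropriate / (1 - p_inappropriate))
print(round(odds_ratio, 1))   # -> 2.6, matching the reported value
```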

Relevance: 30.00%

Abstract:

Objective: The aim of postoperative treatment after cardiac surgery is to avoid low cardiac output syndrome (LCOS). Levosimendan, a new inotropic agent, has been shown in adult patients to be an effective treatment for this purpose when classical therapy fails. It has a positive effect on cardiac output, with fewer adverse effects and lower mortality than dopamine. There are very few data on its benefit in the paediatric population. The aim of this study was to evaluate the effect of levosimendan in children with LCOS after cardiac surgery. Methods: Retrospective analysis of 25 children hospitalised in our PICU after cardiac surgery who developed LCOS not responding to classical catecholamine therapy and who received levosimendan as rescue therapy. LCOS parameters such as urine output, mixed venous oxygen saturation (SvO2), the arterio-venous CO2 difference (AVCO2) and plasma lactate were compared before therapy and at 12, 24, 48 and 72 hours after the beginning of the levosimendan infusion. We also analysed the effect on catecholamine use (amine score), adverse events and mortality. Results: After the beginning of the levosimendan infusion, urine output (3.1 vs 5.3 ml/kg/h, p=0.003) and SvO2 (56 vs 64%, p=0.001) increased significantly during the first 72 hours, while plasma lactate (2.6 vs 1.4 mmol/l, p<0.001), AVCO2 (11 vs 8 mmHg, p=0.002) and the amine score (63 vs 39, p=0.007) decreased significantly. No side effects were noted during administration of levosimendan. In this group of patients, mortality was 0%. Conclusion: Levosimendan is an effective treatment in children after congenital heart surgery. Our study, with a larger sample of patients than previous studies, confirms the improvement in cardiac output already shown in other paediatric studies.

Relevance: 30.00%

Abstract:

Fibrin sealing has recently evolved as a new technique for mesh fixation in endoscopic inguinal hernia repair. A comprehensive Medline search evaluating fibrin sealant for mesh fixation was carried out, and 12 studies were finally included (3 randomized trials, 3 nonrandomized trials, and 6 case series). The trials were assessed for operative time, seroma formation, recovery time, recurrence rate, and acute and chronic pain. There was a trend toward decreased operative times with fibrin sealing compared with mechanical stapling; however, the results for seroma formation remained contradictory. The most important finding was reduced postoperative pain. Recovery times were lower after fibrin sealing, and the recurrence rates showed no differences. Fibrin sealing for mesh fixation in endoscopic inguinal hernia surgery is a promising alternative to mechanical stapling that can be applied safely. As the overall quality of the published data remains poor, further well-designed studies are needed before fibrin sealing can replace mechanical stapling as the new standard for mesh fixation.

Relevance: 30.00%

Abstract:

BACKGROUND: The value of adenovirus plasma DNA detection as an indicator of adenovirus disease is unknown in the context of T cell-replete hematopoietic cell transplantation, of which adenovirus disease is an uncommon but serious complication. METHODS: Three groups of patients, totalling 62 T cell-replete hematopoietic cell transplant recipients, were selected and tested for adenovirus in plasma by polymerase chain reaction (PCR). RESULTS: Adenovirus was detected in 21 (87.5%) of 24 patients with proven adenovirus disease (group 1), in 4 (21%) of 19 patients who shed adenovirus (group 2), and in 1 (10.5%) of 19 uninfected control patients (group 3). The maximum viral load was significantly higher in group 1 (median, 6.3 × 10^6 copies/mL; range, 0 to 1.0 × 10^9 copies/mL) than in group 2 (median, 0 copies/mL; range, 0 to 1.7 × 10^8 copies/mL; P<.001) and in group 3 (median, 0 copies/mL; range, 0 to 40 copies/mL; P<.001). All patients in group 2 who developed adenoviremia had symptoms compatible with adenovirus disease (i.e., possible disease). A minimal plasma viral load of 10^3 copies/mL was detected in all patients with proven or possible disease. Adenoviremia was detectable at a median of 19.5 days (range, 8-48 days) and 24 days (range, 9-41 days) before death for patients with proven and possible adenovirus disease, respectively. CONCLUSION: Sustained or high-level adenoviremia appears to be a specific and sensitive indicator of adenovirus disease after T cell-replete hematopoietic cell transplantation. In a setting with a low prevalence of adenovirus disease, PCR of plasma specimens to detect virus might be a valuable tool for identifying and treating patients at risk of invasive viral disease.
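The sensitivity quoted for proven disease follows directly from the group 1 counts above:

```python
# Sensitivity of plasma adenovirus PCR among patients with proven disease (group 1)
detected, proven_cases = 21, 24
sensitivity = detected / proven_cases * 100
print(f"{sensitivity:.1f}%")   # -> 87.5%, as reported
```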

Relevance: 30.00%

Abstract:

Drinking motives (DM) reflect the reasons why individuals drink alcohol. Weekdays are mainly dedicated to work, whereas weekends are generally associated with spending time with friends during special events or leisure activities; alcohol use on weekdays and weekends may therefore be related to different DM. This study examined whether DM were differentially associated with drinking volume (DV) on weekdays and weekends. A representative sample of 5,391 young Swiss men completed a questionnaire assessing weekday and weekend DV, as well as their DM, namely enhancement, social, coping, and conformity motives. Associations of DM with weekday and weekend DV were examined using structural equation models. Each DM was tested individually in a separate model; all associations were positive and, except for conformity, generally stronger for weekend than for weekday DV. Further specific patterns of association were found when the DM were entered into a single model simultaneously. Associations with both weekday and weekend DV were positive for enhancement and coping motives. However, associations were stronger with weekend than with weekday DV for enhancement motives, and stronger with weekday than with weekend DV for coping motives. Associations of social motives were not significant with weekend DV and were negative with weekday DV. Conformity motives were negatively associated with weekend DV and positively associated with weekday DV. These results suggest that interventions targeting enhancement motives should be particularly effective at decreasing weekend drinking, whereas interventions targeting coping motives should be particularly effective at reducing alcohol use on weekdays.
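As a simplified illustration of the analytic idea (ordinary regression on simulated data rather than the structural equation models actually used, with made-up coefficients), the sketch below regresses weekday and weekend volumes on the four motives entered simultaneously.

```python
# Simplified proxy for the motive-volume associations; data and effect sizes are
# simulated for illustration, not taken from the study.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
motives = rng.normal(size=(n, 4))             # enhancement, social, coping, conformity
weekend_dv = 0.5 * motives[:, 0] + 0.2 * motives[:, 2] + rng.normal(size=n)
weekday_dv = 0.2 * motives[:, 0] + 0.4 * motives[:, 2] + rng.normal(size=n)

X = sm.add_constant(motives)
for name, y in [("weekend", weekend_dv), ("weekday", weekday_dv)]:
    fit = sm.OLS(y, X).fit()
    print(name, np.round(fit.params[1:], 2))  # coefficients for the four motives
```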

Relevance: 30.00%

Abstract:

AIM: To study the prevalence of psychoactive substance use disorder (PSUD) among suicidal adolescents, psychoactive substance intoxication at the moment of the attempt, and the association between PSUD at baseline and either death by suicide or repetition of suicide attempt(s). METHODS: 186 adolescents aged 16 to 21 years hospitalized for a suicide attempt or overwhelming suicidal ideation were included (T0); 148 of them were traced again for evaluations after 6 months (T1) and/or 18 months (T2). DSM-IV diagnoses were assessed each time using the Mini International Neuropsychiatric Interview. RESULTS: At T0, 39.2% of the subjects were found to have a PSUD. A significantly higher proportion of them were intoxicated at the time of the attempt than of those without PSUD (44.3% vs 25.4%). Among the 148 adolescents who could be traced at either T1 or T2, two died by suicide and 30 repeated a suicide attempt one or more times. A marginally significant association was found between death by suicide or repetition of a suicide attempt and alcohol abuse/dependence at baseline (OR=3.3, 95% CI 0.7-15.0; OR=2.6, 95% CI 0.7-9.3). More than one suicide attempt before admission to hospital at T0 (OR=3.2, 95% CI 1.1-10.0) and age over 19 years at T0 (OR=3.2, 95% CI 1.1-9.2) were independently associated with the likelihood of death by suicide or repetition of a suicide attempt. CONCLUSION: Among adolescents hospitalized for a suicide attempt or overwhelming suicidal ideation, the risk of death or repetition of an attempt is high; it is associated with previous suicide attempts, especially among older adolescents, and is also marginally associated with PSUD. These adolescents should be carefully evaluated for such risks and followed up once discharged from the hospital.

Relevance: 30.00%

Abstract:

BACKGROUND: A growing number of case reports have described tenofovir (TDF)-related proximal renal tubulopathy and impaired calculated glomerular filtration rates (cGFR). We assessed TDF-associated changes in cGFR in a large observational HIV cohort. METHODS: We compared treatment-naive patients, or patients with treatment interruptions of at least 12 months, starting either a TDF-based combination antiretroviral therapy (cART) (n = 363) or a TDF-sparing regimen (n = 715). The predefined primary endpoint was the time to a 10 ml/min reduction in cGFR, based on the Cockcroft-Gault equation, confirmed by a follow-up measurement at least 1 month later. In sensitivity analyses, secondary endpoints, including calculations based on the Modification of Diet in Renal Disease (MDRD) formula, were considered. Endpoints were modelled using pre-specified covariates in a multiple Cox proportional hazards model. RESULTS: Two-year event-free probabilities were 0.65 (95% confidence interval [CI] 0.58-0.72) and 0.80 (95% CI 0.76-0.83) for patients starting TDF-containing and TDF-sparing cART, respectively. In the multiple Cox model, diabetes mellitus (hazard ratio [HR] = 2.34 [95% CI 1.24-4.42]), higher baseline cGFR (HR = 1.03 [95% CI 1.02-1.04] per 10 ml/min), TDF use (HR = 1.84 [95% CI 1.35-2.51]) and boosted protease inhibitor use (HR = 1.71 [95% CI 1.30-2.24]) significantly increased the risk of reaching the primary endpoint. Sensitivity analyses showed high consistency. CONCLUSION: There is consistent evidence for a significant reduction in cGFR associated with TDF use in HIV-infected patients. Our findings call for strict monitoring of renal function in long-term TDF users, with tests that distinguish between glomerular dysfunction and proximal renal tubulopathy, a known adverse effect of TDF.
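The Cockcroft-Gault estimate on which the primary endpoint is based is a standard published formula; a minimal implementation is sketched below (the example values are illustrative, not study data).

```python
# Cockcroft-Gault creatinine clearance estimate (standard formula, not study code)
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Estimated creatinine clearance in ml/min."""
    crcl = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

# Hypothetical patient measured at baseline and at follow-up
baseline = cockcroft_gault(45, 70, 1.0, female=False)
follow_up = cockcroft_gault(45, 70, 1.2, female=False)
print(round(baseline), round(follow_up))   # -> 92 and 77 ml/min
# A confirmed drop of >= 10 ml/min from baseline would meet the study's primary endpoint.
```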

Relevance: 30.00%

Abstract:

In response to our suggestion to define substance use disorders via 'heavy use over time', theoretical and conceptual issues, measurement problems and implications for stigma and clinical practice were raised. With respect to the theoretical and conceptual issues, no other criterion has been shown that would improve the definition. Moreover, heavy use over time has been shown to be highly correlated with the number of criteria met under the current DSM-5. Measurement of heavy use over time is simple, and while there will be some underestimation or misrepresentation of actual levels in clinical practice, this is no different from the status quo and the measurement of the current criteria. As regards stigma, research has shown that a truly dimensional concept can help reduce stigma. In conclusion, 'heavy use over time', as a tangible common denominator, should be seriously considered as the definition of substance use disorder.

Relevance: 30.00%

Abstract:

Altitudinal tree lines are mainly constrained by temperature, but can also be influenced by factors such as human activity, particularly in the European Alps, where centuries of agricultural use have affected the tree line. Over recent decades this trend has been reversed owing to changing agricultural practices and land abandonment. We aimed to combine a statistical land-abandonment model with a forest dynamics model to take into account the combined effects of climate and human land use on the Alpine tree line in Switzerland. Land-abandonment probability was expressed by a logistic regression function of degree-day sum, distance from the forest edge, soil stoniness, slope, proportion of employees in the secondary and tertiary sectors, proportion of commuters and proportion of full-time farms. This was implemented in the TreeMig spatio-temporal forest model. Distance from the forest edge and degree-day sum vary through feedback from the dynamics part of TreeMig and from climate change scenarios, while the other variables remain constant for each grid cell over time. The new model, TreeMig-LAb, was tested on theoretical landscapes in which the variables in the land-abandonment model were varied one by one. This confirmed the strong influence of distance from the forest edge and slope on the abandonment probability. Degree-day sum has a more complex role, with opposite influences on land abandonment and forest growth. TreeMig-LAb was also applied to a case study area in the Upper Engadine (Swiss Alps), along with a model in which the abandonment probability was constant. Two scenarios were used: natural succession only (100% probability) and a probability of abandonment based on past transition proportions in that area (2.1% per decade). The former showed new forest growing in all but the highest-altitude locations. The latter was more realistic in terms of the number of newly forested cells, but their locations were random and the resulting landscape heterogeneous. Using the logistic regression model gave results consistent with observed patterns of land abandonment: existing forests expanded and gaps closed, leading to an increasingly homogeneous landscape.
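The land-abandonment submodel is described as a logistic regression over the listed predictors; the sketch below shows the general form with placeholder coefficients, since the fitted values are not given in the abstract.

```python
# Schematic form of the per-grid-cell logistic land-abandonment probability;
# the coefficients below are placeholders, not the fitted values from the study.
import math

COEF = {
    "intercept": -2.0,
    "degree_day_sum": -0.001,
    "dist_forest_edge": -0.01,
    "stoniness": 0.02,
    "slope": 0.03,
    "share_secondary_tertiary": 0.5,
    "share_commuters": 0.4,
    "share_fulltime_farms": -0.8,
}

def abandonment_probability(cell):
    """cell: dict mapping predictor names to their values for one grid cell."""
    eta = COEF["intercept"] + sum(COEF[name] * value for name, value in cell.items())
    return 1.0 / (1.0 + math.exp(-eta))          # inverse-logit link

example_cell = {"degree_day_sum": 900, "dist_forest_edge": 50, "stoniness": 10,
                "slope": 25, "share_secondary_tertiary": 0.6,
                "share_commuters": 0.3, "share_fulltime_farms": 0.2}
print(round(abandonment_probability(example_cell), 2))
```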

Relevance: 30.00%

Abstract:

BACKGROUND: Prevalence of unhealthy alcohol use among medical inpatients is high. OBJECTIVE: To characterize the course and outcomes of unhealthy alcohol use, and factors associated with these outcomes. DESIGN: Prospective cohort study. PARTICIPANTS: A total of 287 medical inpatients with unhealthy alcohol use. MAIN MEASURES: At baseline and 12 months later, consumption and alcohol-related consequences were assessed. The outcome of interest was a favorable drinking outcome at 12 months (abstinence or drinking "moderate" amounts without consequences). The independent variables evaluated included demographics, physical/sexual abuse, drug use, depressive symptoms, alcohol dependence, commitment to change (Taking Action), spending time with heavy-drinking friends and receipt of alcohol treatment (after hospitalization). Adjusted regression models were used to evaluate factors associated with a favorable outcome. KEY RESULTS: Thirty-three percent had a favorable drinking outcome 1 year later. Not spending time with heavy-drinking friends [adjusted odds ratio (AOR) 2.14, 95% CI 1.14-4.00] and receipt of alcohol treatment [AOR (95% CI): 2.16 (1.20-3.87)] were associated with a favorable outcome. Compared to the first quartile (lowest level) of Taking Action, subjects in the second, third and highest quartiles had higher odds of a favorable outcome [AOR (95% CI): 3.65 (1.47, 9.02), 3.39 (1.38, 8.31) and 6.76 (2.74, 16.67)]. CONCLUSIONS: Although most medical inpatients with unhealthy alcohol use continue drinking at-risk amounts and/or have alcohol-related consequences, one third are abstinent or drink "moderate" amounts without consequences 1 year later. Not spending time with heavy-drinking friends, receipt of alcohol treatment and commitment to change are associated with this favorable outcome. This can inform efforts to address unhealthy alcohol use among patients who often do not seek specialty treatment.