913 results for HIV infections Patients Nutrition


Relevance:

30.00%

Publisher:

Abstract:

Background Through clinical observation, nursing staff of an inpatient rehabilitation unit identified a link between incontinence and undiagnosed urinary tract infections (UTIs). Further clinical observation and structured continence management led to the realisation that urinary incontinence often improved, or resolved completely, after treatment with antibiotics. In 2009, a small study found that 30% of admitted rehabilitation patients had an undiagnosed UTI, with the majority admitted post-orthopaedic fracture. We suspected that the frequent use of indwelling urinary catheters (IDCs) in the orthopaedic environment may have been a contributing factor. Therefore, a second, more thorough study was commenced in 2010 and completed in 2011. Aim The aim of this study was to identify the proportion of patients admitted to one rehabilitation unit with an undiagnosed UTI over a 12-month period. We also wanted to identify and highlight the presence of known risk factors associated with UTI and to determine whether urinary incontinence was associated with the presence of UTI. Methods Data were collected from every patient who was admitted over a 12-month period (n=140). The majority of patients were over the age of 65 and had an orthopaedic fracture (36.4%) or stroke (27.1%). Mid-stream urine (MSU) samples, routinely collected and sent for culture and sensitivity as part of the standard admission procedure, were used by the treating medical officer to detect the presence of UTI. A data collection sheet was developed, reviewed and trialled before official data collection commenced. Data were collected as part of usual practice and collated by a research assistant. Inferential statistics were used to analyse the data. Results This study found that 25 (17.9%) of the 140 patients admitted to rehabilitation had an undiagnosed UTI, with a statistically significant association between the prior presence of an IDC and the diagnosis of UTI.
Urinary incontinence improved after the completion of treatment with antibiotics. Results further demonstrated a significant association between the confirmation of a UTI on culture and sensitivity and the absence of symptoms usually associated with UTI, such as burning or stinging on urination. Overall, this study suggests careful monitoring of urinary symptoms in patients admitted to rehabilitation, especially in patients with a prior IDC, is warranted.
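The headline numbers lend themselves to a quick check. Below is a minimal Python sketch (not the study's actual analysis) that reproduces the 17.9% prevalence and runs a Pearson chi-square test for an IDC-UTI association; only the totals (25 UTIs among 140 admissions) come from the abstract, while the 2x2 cell split is hypothetical.

```python
# Reproducing the reported prevalence and sketching the kind of test behind
# the IDC-UTI association. Only the totals (25 UTIs among 140 admissions)
# come from the abstract; the 2x2 cell counts below are HYPOTHETICAL.

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    numerator = n * (a * d - b * c) ** 2
    denominator = (a + b) * (c + d) * (a + c) * (b + d)
    return numerator / denominator

prevalence = 25 / 140   # undiagnosed UTIs among all admissions (abstract)
print(f"UTI prevalence: {prevalence:.1%}")   # prints 17.9%

# Hypothetical split: 15/40 patients with a prior IDC had a UTI,
# versus 10/100 of those without one.
chi2 = chi_square_2x2(15, 25, 10, 90)
print(f"chi-square = {chi2:.2f} (5% critical value for 1 df is 3.84)")
```

A statistic above the 3.84 critical value corresponds to p < 0.05 for a 2x2 table, which is the sense in which the abstract's association is "statistically significant".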


M. fortuitum is a rapidly growing mycobacterium associated with community-acquired and nosocomial wound, soft tissue, and pulmonary infections. It has been postulated that water is the source of infection, especially in the hospital setting. The aim of this study was to determine whether municipal water may be the source of community-acquired or nosocomial infections in the Brisbane area. Between 2007 and 2009, 20 strains of M. fortuitum were recovered from municipal water and 53 patients’ isolates were submitted to the reference laboratory. A wide variation in strain types was identified using repetitive element sequence-based PCR, with 13 clusters of ≥2 indistinguishable isolates and 28 patterns consisting of individual isolates. The clusters could be grouped into seven similar groups (>95% similarity). Municipal water and clinical isolates collected during the same time period and from the same geographical area consisted of different strain types, making municipal water an unlikely source of sporadic human infection.


Background Nutrition screening identifies patients at risk of malnutrition to facilitate early nutritional intervention. Studies have reported incompletion and error rates of 30-90% for a range of commonly used screening tools. This study aimed to investigate the incompletion and error rates of 3-Minute Nutrition Screening (3-MinNS) and the effect of quality improvement initiatives on the overall performance of the screening tool and the referral process for at-risk patients. Methods Annual audits were carried out from 2008-2013 on 4467 patients. Value Stream Mapping, the Plan-Do-Check-Act cycle and Root Cause Analysis were used to identify gaps and determine the best intervention. The intervention included 1) implementing a nutrition screening protocol, 2) nutrition screening training, 3) nurse empowerment for online dietetics referral of at-risk cases, 4) a closed-loop feedback system and 5) removing the component of 3-MinNS that caused the most errors, without compromising its sensitivity and specificity. Results Nutrition screening error rates were 33% and 31%, with 5% and 8% blank or missing forms, in 2008 and 2009 respectively. For patients at risk of malnutrition, referral to dietetics took up to 7.5 days, with 10% not referred at all. After the intervention, the latter decreased to 7% (2010), 4% (2011) and 3% (2012 and 2013), and the mean turnaround time from screening to referral fell significantly, from 4.3 ± 1.8 days to 0.3 ± 0.4 days (p < 0.001). Error rates were reduced to 25% (2010), 15% (2011), 7% (2012) and 5% (2013), and the percentage of blank or missing forms fell to, and remained at, 1%. Conclusion Quality improvement initiatives were effective in reducing the incompletion and error rates of nutrition screening and led to sustainable improvements in the referral process for patients at nutritional risk.


Information on the foods patients like and dislike is the essential basis for planning menus that are acceptable to patients and promote adequate consumption. The aim of this study was to obtain quantitative data on the food preferences of inpatients at a large metropolitan public hospital for use in menu planning. Methodology was based on a study by Williams et al. (1988) and included additional questions about appetite and taste changes. The survey used a 9-point hedonic scale to rate foods listed in random order and was modified to incorporate more contemporary foods than those used in the original Williams study. Surveys were conducted by final-year University of Queensland dietetics students on Food Service Practicum at the Royal Brisbane and Women’s Hospital (929 beds) in 2012. The first survey (220 questions, n = 157) had a response rate of 61%. The second included more sandwich fillings and salads (231 questions, n = 219, response rate 67%). The total number surveyed was 376. The most preferred foods were roast potato, grilled steak, ice cream, fresh strawberries, roast lamb, roast beef, grapes and banana. The least preferred foods were grapefruit, soybeans, lentils, sardines, prune juice and grapefruit juice. Patients who reported taste changes (10%) had food preferences similar to those who did not report taste changes. Patients who reported a poor/very poor appetite (10%) generally scored foods lower than those who reported an OK (22%) or good/very good (65%) appetite. The results of this study informed planning for a new patient menu at the RBWH in December 2012.


Background Sexually-transmitted pathogens often have severe reproductive health implications if treatment is delayed or absent, especially in females. The complex processes of disease progression, namely replication and ascension of the infection through the genital tract, span both extracellular and intracellular physiological scales, and in females can vary over the distinct phases of the menstrual cycle. The complexity of these processes, coupled with the common impossibility of obtaining comprehensive and sequential clinical data from individual human patients, makes mathematical and computational modelling valuable tools in developing our understanding of the infection, with a view to identifying new interventions. While many within-host models of sexually-transmitted infections (STIs) are available in existing literature, these models are difficult to deploy in clinical/experimental settings since simulations often require complex computational approaches. Results We present STI-GMaS (Sexually-Transmitted Infections – Graphical Modelling and Simulation), an environment for simulation of STI models, with a view to stimulating the uptake of these models within the laboratory or clinic. The software currently focuses upon the representative case study of Chlamydia trachomatis, the most common sexually-transmitted bacterial pathogen of humans. Here, we demonstrate the use of a hybrid PDE–cellular automata model for simulation of a hypothetical Chlamydia vaccination, showing the effect of a vaccine-induced antibody in preventing the infection from ascending to above the cervix. This example illustrates the ease with which existing models can be adapted to describe new studies, and its careful parameterisation within STI-GMaS facilitates future tuning to experimental data as they arise.
Conclusions STI-GMaS represents the first software designed explicitly for in-silico simulation of STI models by non-theoreticians, thus presenting a novel route to bridging the gap between computational and clinical/experimental disciplines. With the propensity for model reuse and extension, there is much scope within STI-GMaS to allow clinical and experimental studies to inform model inputs and drive future model development. Many of the modelling paradigms and software design principles deployed to date transfer readily to other STIs, both bacterial and viral; forthcoming releases of STI-GMaS will extend the software to incorporate a more diverse range of infections.
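To give a flavour of what a hybrid PDE–cellular-automaton model of ascending infection looks like, here is a deliberately simplified 1-D sketch: a discretised reaction-diffusion equation tracks extracellular particle load along the tract, a cellular automaton tracks per-site infection state, and an antibody-clearance term above the cervix mimics the vaccination scenario. The geometry, rates and thresholds are all hypothetical illustrations, not the published STI-GMaS Chlamydia model.

```python
# Simplified 1-D hybrid PDE/CA model of ascending infection.
# All parameter values here are HYPOTHETICAL illustrations.

N = 30                 # epithelial sites, vagina (0) toward uterus (N-1)
CERVIX = 15            # sites above this index are "above the cervix"
D = 0.2                # diffusion coefficient (explicit finite differences)
SHED = 0.3             # particle shedding per infected site per step
CLEAR_BELOW = 0.05     # natural clearance at/below the cervix
CLEAR_ABOVE = 0.8      # vaccine-induced antibody clearance above it
THRESHOLD = 0.5        # local load needed to infect a susceptible site

load = [0.0] * N           # PDE layer: extracellular particle load
infected = [False] * N     # CA layer: per-site infection state
infected[0] = True         # infection seeded at the vaginal end

for _ in range(400):
    new = []
    for i in range(N):
        left = load[i - 1] if i > 0 else load[i]        # no-flux boundary
        right = load[i + 1] if i < N - 1 else load[i]
        value = load[i] + D * (left - 2 * load[i] + right)
        if infected[i]:
            value += SHED
        decay = CLEAR_ABOVE if i > CERVIX else CLEAR_BELOW
        new.append(max(value * (1 - decay), 0.0))
    load = new
    for i in range(N):   # CA update: threshold infection of susceptible sites
        if not infected[i] and load[i] > THRESHOLD:
            infected[i] = True

below = sum(infected[: CERVIX + 1])
above = sum(infected[CERVIX + 1:])
print(f"infected sites at/below cervix: {below}, above cervix: {above}")
```

In this toy setting the infection front propagates site by site below the cervix, while the strong clearance term keeps the extracellular load above the cervix below the infection threshold, which is the qualitative behaviour the vaccination case study demonstrates.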


Background Research has identified associations between serum 25(OH)D and a range of clinical outcomes in chronic kidney disease and wider populations. The present study aimed to investigate vitamin D deficiency/insufficiency in dialysis patients and its relationship with vitamin D intake and sun exposure. Methods A cross-sectional study was used. Participants included 30 peritoneal dialysis (PD) (43.3% male; 56.87 ± 16.16 years) and 26 haemodialysis (HD) (80.8% male; 63.58 ± 15.09 years) patients attending a department of renal medicine. Explanatory variables were usual vitamin D intake from diet/supplements (IU day−1) and sun exposure (min day−1). Vitamin D intake, sun exposure and ethnic background were assessed by questionnaire. Weight, malnutrition status and routine biochemistry were also assessed. Data were collected during usual department visits. The main outcome measure was serum 25(OH)D (nmol/L). Results The prevalence of vitamin D insufficiency/deficiency differed between dialysis modalities, with 31% and 43% found to be insufficient (<50 nmol/L) and 4% and 33% found to be deficient (<25 nmol/L) in HD and PD patients, respectively (P < 0.001). In HD patients, there were correlations between dietary and supplemental vitamin D intake and 25(OH)D (ρ = 0.84, P < 0.001) and between average sun exposure and 25(OH)D (ρ = 0.50, P < 0.02). There were no such associations in PD patients. The results remained significant for vitamin D intake after multiple regression adjusting for age, gender and sun exposure. Conclusions The results highlight a strong association between vitamin D intake and 25(OH)D in HD but not PD patients, with implications for replacement recommendations. The findings indicate that, even in a sunny climate, many dialysis patients are vitamin D deficient, highlighting the need for exploration of determinants and consequences.
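The reported correlations (e.g. ρ = 0.84 between vitamin D intake and 25(OH)D in HD patients) are Spearman rank correlations. A self-contained sketch of the calculation follows; the paired intake/serum values below are invented for illustration, only the method (rank both variables, then compute Pearson's correlation on the ranks) reflects the analysis named in the abstract.

```python
# Spearman's rank correlation from scratch, on HYPOTHETICAL paired data.

def ranks(values):
    """Average ranks (1-based); tied values share the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1            # mean of the tied positions, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation computed on the ranks of x and y."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical daily vitamin D intake (IU) and serum 25(OH)D (nmol/L)
intake = [80, 200, 150, 400, 600, 120, 1000, 50]
serum = [30, 48, 55, 70, 85, 40, 110, 28]
print(f"Spearman rho = {spearman(intake, serum):.2f}")
```

Because Spearman's method uses ranks rather than raw values, it captures any monotone intake-serum relationship without assuming linearity, which suits skewed quantities such as supplement doses.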


Background Many countries are scaling up malaria interventions towards elimination. This transition changes the demands on malaria diagnostics from diagnosing ill patients to detecting parasites in all carriers, including asymptomatic infections and infections with low parasite densities. Detection methods suited to the local malaria epidemiology must be selected before transitioning a malaria control programme to elimination. A baseline malaria survey conducted in Temotu Province, Solomon Islands in late 2008, as the first step in a provincial malaria elimination programme, provided malaria epidemiology data and an opportunity to assess how well different diagnostic methods performed in this setting. Methods During the survey, 9,491 blood samples were collected and examined by microscopy for Plasmodium species and density, with a subset also examined by polymerase chain reaction (PCR) and rapid diagnostic tests (RDTs). The performances of these diagnostic methods were compared. Results A total of 256 samples were positive by microscopy, giving a point prevalence of 2.7%. The species distribution was 17.5% Plasmodium falciparum and 82.4% Plasmodium vivax. In this low-transmission setting, only 17.8% of the P. falciparum-infected and 2.9% of the P. vivax-infected subjects were febrile (≥38°C) at the time of the survey. A significant proportion of infections detected by microscopy, 40% for P. falciparum and 65.6% for P. vivax, had a parasite density below 100/μL. The proportion of infections with parasite density below 100/μL correlated with age for P. vivax infections, but not for P. falciparum infections. PCR detected substantially more infections than microscopy (point prevalence of 8.71%), indicating that a large number of subjects had sub-microscopic parasitaemia. The concordance between PCR and microscopy in detecting single species was greater for P. vivax (135/162) than for P. falciparum (36/118). The malaria RDT detected the 12 microscopy- and PCR-positive P. falciparum infections, but failed to detect 12 of 13 microscopy- and PCR-positive P. vivax infections. Conclusion Asymptomatic malaria infections and infections with low and sub-microscopic parasite densities are highly prevalent in Temotu Province, where malaria transmission is low. This presents a challenge for elimination, since a large proportion of the parasite reservoir will not be detected by standard active and passive case detection. Effective mass screening and treatment campaigns will therefore most likely need more sensitive assays, such as a field-deployable molecular-based assay.
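The headline figures can be reproduced directly from the counts given in the abstract; a short sketch makes the arithmetic explicit.

```python
# Reproducing the abstract's prevalence and concordance figures from its
# raw counts (all numbers below are taken from the abstract itself).

samples = 9491
positive_microscopy = 256
prevalence = positive_microscopy / samples
print(f"Microscopy point prevalence: {prevalence:.1%}")   # prints 2.7%

# Concordance between PCR and microscopy for single-species detection
vivax_concordance = 135 / 162
falciparum_concordance = 36 / 118
print(f"P. vivax concordance: {vivax_concordance:.1%}")        # prints 83.3%
print(f"P. falciparum concordance: {falciparum_concordance:.1%}")  # prints 30.5%
```

The contrast between the two concordance ratios is the quantitative basis for the abstract's claim that agreement between the methods was much better for P. vivax than for P. falciparum.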


In his letter, Cunha suggests that oral antibiotic therapy is safer and less expensive than intravenous therapy via central venous catheters (CVCs) (1). The implication is that costs will fall and increased health benefits will be enjoyed, resulting in a gain in efficiency within the healthcare system. CVCs are often used in critically ill patients to deliver antimicrobial therapy, but expose patients to a risk of catheter-related bloodstream infection (CRBSI). Our current knowledge about the efficiency (i.e. cost-effectiveness) of allocating resources toward interventions that prevent CRBSI in patients requiring a CVC has already been reviewed (2). If for some patient groups antimicrobial therapy can be delivered orally, instead of through a CVC, then the costs and benefits of this alternative strategy should be evaluated...


Optimal nutrition across the continuum of care plays a key role in the short- and long-term clinical and economic outcomes of patients. Worldwide, an estimated one-quarter to one-half of patients admitted to hospitals each year are malnourished. Malnutrition can increase healthcare costs by delaying patient recovery and rehabilitation and increasing the risk of medical complications. Nutrition interventions have the potential to provide cost-effective preventive care and treatment measures. However, limited data exist on the economics and impact evaluations of these interventions. In this report, nutrition and health system researchers, clinicians, economists, and policymakers discuss emerging global research on nutrition health economics, the role of nutrition interventions across the continuum of care, and how nutrition can affect healthcare costs in the context of hospital malnutrition.


Background To determine the impact of cataract surgery on vision-related quality of life (VRQOL) and examine the association between objective visual measures and change in VRQOL after surgery among bilateral cataract patients in Ho Chi Minh City, Vietnam. Methods A cohort of older patients with bilateral cataract was assessed one week before and one to three months after first-eye or both-eye cataract surgery. Visual measures including visual acuity, contrast sensitivity and stereopsis were obtained. Vision-related quality of life was assessed using the NEI VFQ-25. Descriptive analyses and a generalized estimating equation (GEE) analysis were undertaken to measure change in VRQOL after surgery. Results Four hundred and thirteen patients were assessed before cataract surgery and 247 completed the follow-up assessment one to three months after first-eye or both-eye cataract surgery. Overall, VRQOL significantly improved after cataract surgery (p < 0.001), particularly after both-eye surgery. Binocular contrast sensitivity (p < 0.001) and stereopsis (p < 0.001) were also associated with change in VRQOL after cataract surgery. Visual acuity was not associated with VRQOL. Conclusions Cataract surgery significantly improved VRQOL among bilateral cataract patients in Vietnam. Contrast sensitivity and stereopsis, rather than visual acuity, significantly affected VRQOL after cataract surgery.


Purpose To determine the prevalence of falls in the 12 months prior to cataract surgery and examine the associations between visual and other risk factors and falls among older bilateral cataract patients in Vietnam. Methods Data collected from 413 patients in the week before scheduled cataract surgery included a questionnaire and three objective visual tests. Results The outcome of interest was self-reported falls in the previous 12 months. A total of 13% (n = 53) of bilateral cataract patients reported 60 falls within the previous 12 months. After adjusting for age, sex, race, employment status, comorbidities, medication usage, refractive management, living status and the three objective visual tests in the worse eye, women (odds ratio, OR, 4.64, 95% confidence interval, CI, 1.85–11.66), and those who lived alone (OR 4.51, 95% CI 1.44–14.14) were at increased risk of a fall. Those who reported a comorbidity were at decreased risk of a fall (OR 0.43, 95% CI 0.19–0.95). Contrast sensitivity (OR 0.31, 95% CI 0.10–0.95) was the only significant visual test associated with a fall. These results were similar for the better eye, except the presence of a comorbidity was not significant (OR 0.45, 95% CI 0.20–1.02). Again, contrast sensitivity was the only significant visual factor associated with a fall (OR 0.15, 95% CI 0.04–0.53). Conclusion Bilateral cataract patients in Vietnam are potentially at high risk of falls and in need of falls prevention interventions. It may also be important for ophthalmologists and health professionals to consider contrast sensitivity measures when prioritizing cataract patients for surgery and assessing their risk of falls.
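The reported odds ratios are adjusted estimates from a multivariable model; the unadjusted analogue is a 2x2-table odds ratio with a Wald confidence interval. The sketch below uses hypothetical counts: the totals of 413 patients and 53 fallers match the abstract, but the split by living situation is invented for illustration.

```python
import math

# Crude (unadjusted) odds ratio with a 95% Wald confidence interval.
# The study's ORs are adjusted via regression; this is the 2x2 analogue
# on HYPOTHETICAL counts (fallers/non-fallers, living alone vs. with others).

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Table [[a, b], [c, d]] = [[exposed fallers, exposed non-fallers],
                                 [unexposed fallers, unexposed non-fallers]]."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical: 10 of 30 patients living alone fell, vs. 43 of 383 others.
or_, lo, hi = odds_ratio_ci(10, 20, 43, 340)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

A confidence interval that excludes 1 (as here, and as in the abstract's interval of 1.44-14.14 for living alone) indicates a statistically significant association with falling.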


The prevalence of protein-energy malnutrition (PEM), food intake inadequacy and associated health-related outcomes in morbidly obese (Body Mass Index ≥ 40 kg/m2) acute care patients is unknown. This study reports findings in morbidly obese participants from the Australasian Nutrition Care Day Survey (ANCDS) conducted in 2010. The ANCDS was a cross-sectional survey involving acute care patients from 56 Australian and New Zealand hospitals. Hospital-based dietitians evaluated participants’ nutritional status (defined by Subjective Global Assessment, SGA) and 24-hour food intake (as 0%, 25%, 50%, 75% or 100% of the offered food). Three months later, outcome data, including length of stay (LOS) and 90-day in-hospital mortality, were collected. Of the 3122 participants, 4% (n = 136) were morbidly obese (67% female, 55 ± 14 years, BMI: 48 ± 8 kg/m2). Eleven percent (n = 15) of the morbidly obese patients were malnourished, and most of these (n = 11/15, 73%) received standard hospital diets without additional nutritional support. Malnourished morbidly obese patients had a significantly longer LOS and greater 90-day in-hospital mortality than their well-nourished counterparts (23 days vs. 9 days, p = 0.036; 14% vs. 0% mortality, p = 0.011, respectively). Thirteen morbidly obese patients (10%) consumed only 25% of the offered meals, with a significantly greater proportion among malnourished (n = 4, 27%) than well-nourished (n = 9, 7%) patients (p = 0.018). These results provide new knowledge on the prevalence of PEM and poor food intake in morbidly obese patients in Australian and New Zealand hospitals. For the first time internationally, the study establishes that PEM is significantly associated with negative outcomes in morbidly obese patients and warrants timely nutritional support during hospitalisation.


Recent data have highlighted the association between penetration of antiretrovirals into the central nervous system (CNS) and neurocognitive impairment in HIV-positive patients. Existing antiretrovirals have been ranked according to a score of neuropenetration, which was shown to be a predictor of anti-HIV activity in the CNS and of improvement of neurocognitive disorders [1]. The main factors affecting drug penetration are known to be protein binding, lipophilicity and molecular weight [2]. Moreover, active transport by membrane transporters (such as P-glycoprotein) could be a key mechanism of passage [3]. The use of raltegravir (RGV), a novel antiretroviral drug that inhibits HIV integrase, is increasing worldwide due to its efficacy and tolerability. However, the penetration of RGV into the CNS has not yet been elucidated. In fact, prediction of RGV neuropenetration from its molecular characteristics is controversial. Intermediate protein binding (83%) and a large volume of distribution (273 l) could suggest a high distribution beyond extracellular spaces [4]. On the contrary, low lipophilicity (oil/water partition coefficient at pH 7.4 of 2.80) and intermediate molecular weight (482.51 Da) suggest limited diffusion. Furthermore, in-vitro studies suggest that RGV is a substrate of P-glycoprotein, although this efflux pump has not been found to significantly affect plasma pharmacokinetics [5]. In any case, no data concerning RGV passage into the cerebrospinal fluid of animals or humans have yet been published.


Background: Pediatric nutrition risk screening tools are not routinely implemented throughout many hospitals, despite prevalence studies demonstrating that malnutrition is common in hospitalized children. Existing tools lack the simplicity of those used to assess nutrition risk in the adult population. This study reports the accuracy of a new, quick, and simple pediatric nutrition screening tool (PNST) designed to be used for pediatric inpatients. Materials and Methods: The pediatric Subjective Global Nutrition Assessment (SGNA) and anthropometric measures were used to develop and assess the validity of 4 simple nutrition screening questions comprising the PNST. Participants were pediatric inpatients in 2 tertiary pediatric hospitals and 1 regional hospital. Results: Two affirmative answers to the PNST questions were found to maximize the sensitivity and specificity relative to the pediatric SGNA and body mass index (BMI) z scores for malnutrition in 295 patients. The PNST identified 37.6% of patients as being at nutrition risk, whereas the pediatric SGNA identified 34.2%. The sensitivity and specificity of the PNST compared with the pediatric SGNA were 77.8% and 82.1%, respectively. The sensitivity of the PNST at detecting patients with a BMI z score of less than -2 was 89.3%, and the specificity was 66.2%. Both the PNST and pediatric SGNA were relatively poor at detecting patients who were stunted or overweight, with the sensitivity and specificity being less than 69%. Conclusion: The PNST provides a sensitive, valid, and simpler alternative to existing pediatric nutrition screening tools such as the Screening Tool for the Assessment of Malnutrition in Pediatrics (STAMP), Screening Tool Risk on Nutritional status and Growth (STRONGkids), and Paediatric Yorkhill Malnutrition Score (PYMS) to ensure the early detection of hospitalized children at nutrition risk.
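Sensitivity and specificity here are standard confusion-matrix quantities, with the pediatric SGNA as the reference standard. The counts in the sketch below are a hypothetical reconstruction chosen only to total n = 295 and land near the reported 77.8%/82.1%; the study's actual cell counts are not given in the abstract.

```python
# Screening-tool accuracy from a confusion matrix, with the reference
# standard (here, pediatric SGNA) defining true at-risk status.
# The counts are a HYPOTHETICAL reconstruction approximating the abstract.

def screen_metrics(tp, fn, fp, tn):
    sensitivity = tp / (tp + fn)   # at-risk patients correctly flagged
    specificity = tn / (tn + fp)   # well-nourished patients correctly passed
    ppv = tp / (tp + fp)           # flagged patients who are truly at risk
    npv = tn / (tn + fn)           # passed patients who are truly well
    return sensitivity, specificity, ppv, npv

sens, spec, ppv, npv = screen_metrics(tp=78, fn=22, fp=35, tn=160)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}")
```

For a screening tool, a high sensitivity matters most (missed at-risk children go without intervention), which is why the cut-off of two affirmative answers was chosen to balance sensitivity against specificity.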


Background: It is important to identify patients who are at risk of malnutrition upon hospital admission, as malnutrition results in poor outcomes such as longer length of hospital stay, readmission, higher hospitalisation cost and mortality. The aim of this study was to determine the prognostic validity of 3-Minute Nutrition Screening (3-MinNS) in predicting hospital outcomes in patients admitted to an acute tertiary hospital, matched by diagnosis-related groups (DRG). Methods: In this study, 818 adult patients were screened for risk of malnutrition using 3-MinNS within 24 hours of admission. Mortality data were collected from the National Registry, with other hospitalisation outcomes retrieved from electronic hospital records. The results were adjusted for age, gender and ethnicity, and matched for DRG. Results: Patients identified as at risk of malnutrition (37%) using 3-MinNS showed significant positive associations with longer length of hospital stay (6.6 ± 7.1 days vs. 4.5 ± 5.5 days, p<0.001), higher hospitalisation cost (S$4540 ± 7190 vs. S$3630 ± 4961, p<0.001) and increased mortality at 1 year (27.8% vs. 3.9%), 2 years (33.8% vs. 7.2%) and 3 years (39.1% vs. 10.5%); p<0.001 for all. Conclusions: 3-MinNS is able to predict clinical outcomes and can be used to screen newly admitted patients for nutrition risk so that appropriate nutrition assessment and early nutritional intervention can be initiated.