998 results for BEL13-007
Abstract:
BACKGROUND: Malnutrition and poor intake during hospitalisation are common in older medical patients. Better understanding of patient-specific factors associated with poor intake may inform nutritional interventions. AIMS: To measure the proportion of older medical patients with inadequate nutritional intake, and identify patient-related factors associated with this outcome. METHODS: Prospective cohort study enrolling consecutive consenting medical inpatients aged 65 years or older. The primary outcome was energy intake less than resting energy expenditure estimated using weight-based equations. Energy intake was calculated for a single day using direct observation of plate waste. Explanatory variables included age, gender, number of co-morbidities, number of medications, diagnosis, usual residence, nutritional status, functional and cognitive impairment, depressive symptoms, poor appetite, poor dentition, and dysphagia. RESULTS: Of 134 participants (mean age 80 years, 51% female), only 41% met estimated resting energy requirements. Mean energy intake was 1220 kcal/day (SD 440), or 18.1 kcal/kg/day. Factors associated with inadequate energy intake in multivariate analysis were poor appetite, higher BMI, diagnosis of infection or cancer, delirium and need for assistance with feeding. CONCLUSIONS: Inadequate nutritional intake is common, and patient factors contributing to poor intake need to be considered in nutritional interventions.
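As a rough illustration of the primary outcome above, comparing observed intake against a weight-based resting energy expenditure estimate can be sketched as follows (the 20 kcal/kg/day factor and function names are illustrative assumptions, not the study's actual equations):

```python
def estimated_ree_kcal(weight_kg, kcal_per_kg=20.0):
    """Resting energy expenditure from a simple weight-based equation.

    The 20 kcal/kg/day factor is an illustrative placeholder, not the
    study's actual equation."""
    return weight_kg * kcal_per_kg

def intake_adequate(intake_kcal, weight_kg):
    """True if observed daily energy intake meets the estimated REE
    (the study's primary outcome was intake below this threshold)."""
    return intake_kcal >= estimated_ree_kcal(weight_kg)

# The cohort's mean intake (1220 kcal/day) for a hypothetical 70 kg patient:
print(intake_adequate(1220, 70))  # → False (1220 < 1400 kcal)
```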
Abstract:
Objective Factors associated with the development of hallux valgus (HV) are multifactorial and remain unclear. The objective of this systematic review and meta-analysis was to investigate characteristics of foot structure and footwear associated with HV. Design Electronic databases (Medline, Embase, and CINAHL) were searched to December 2010. Cross-sectional studies with a valid definition of HV and a non-HV comparison group were included. Two independent investigators quality rated all included papers. Effect sizes and 95% confidence intervals (CIs) were calculated (standardized mean differences (SMDs) for continuous data and risk ratios (RRs) for dichotomous data). Where studies were homogeneous, pooling of SMDs was conducted using random effects models. Results A total of 37 papers (34 unique studies) were quality rated. After exclusion of studies without reported measurement reliability for associated factors, data were extracted and analysed from 16 studies reporting results for 45 different factors. Significant factors included: greater first intermetatarsal angle (pooled SMD = 1.5, CI: 0.88–2.1), longer first metatarsal (pooled SMD = 1.0, CI: 0.48–1.6), round first metatarsal head (RR: 3.1–5.4), and lateral sesamoid displacement (RR: 5.1–5.5). Results for clinical factors (e.g., first ray mobility, pes planus, footwear) were less conclusive regarding their association with HV. Conclusions Although conclusions regarding causality cannot be made from cross-sectional studies, this systematic review highlights important factors to monitor in HV assessment and management. Further studies with rigorous methodology are warranted to investigate clinical factors associated with HV.
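Where the abstract describes pooling SMDs with random-effects models, the standard DerSimonian-Laird procedure is the usual choice; a generic sketch (not the authors' code, and study inputs are hypothetical):

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled effect size (DerSimonian-Laird), the usual
    procedure for pooling SMDs across studies. Returns the pooled
    estimate and its 95% confidence interval."""
    k = len(effects)
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                    # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```

With homogeneous studies the between-study variance collapses to zero and the result matches the fixed-effect pooled SMD.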
Abstract:
Objective Although several validated nutritional screening tools have been developed to “triage” inpatients for malnutrition diagnosis and intervention, there continues to be debate in the literature as to which tool or tools clinicians should use in practice. This study compared the accuracy of seven validated screening tools in older medical inpatients against two validated nutritional assessment methods. Methods This was a prospective cohort study of medical inpatients at least 65 y old. Malnutrition screening was conducted using seven tools recommended in evidence-based guidelines. Nutritional status was assessed by an accredited practicing dietitian using the Subjective Global Assessment (SGA) and the Mini-Nutritional Assessment (MNA). Energy intake was observed on a single day during the first week of hospitalization. Results In this sample of 134 participants (80 ± 8 y old, 50% women), there was fair agreement between the SGA and MNA (κ = 0.53), with the MNA identifying more “at-risk” patients and the SGA better identifying existing malnutrition. Most tools were accurate in identifying patients with malnutrition as determined by the SGA, in particular the Malnutrition Screening Tool and the Nutritional Risk Screening 2002. The MNA Short Form was most accurate at identifying nutritional risk according to the MNA. No tool accurately predicted patients with inadequate energy intake in the hospital. Conclusion Because all tools generally performed well, clinicians should consider choosing a screening tool that best aligns with their chosen nutritional assessment and is easiest to implement in practice. This study confirmed the importance of rescreening and monitoring food intake to allow the early identification and prevention of nutritional decline in patients with a poor intake during hospitalization.
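The fair agreement reported between the SGA and MNA (κ = 0.53) refers to Cohen's kappa, which corrects raw agreement for chance; a minimal sketch with illustrative names:

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: chance-corrected agreement between two raters or
    tools classifying the same cases (e.g. SGA vs. MNA categories)."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: proportion of cases where both agree.
    p_obs = sum(x == y for x, y in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement from each rater's marginal frequencies.
    cats = set(ratings_a) | set(ratings_b)
    p_exp = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
                for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)
```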
Abstract:
Long undersea debris runout can be facilitated by a boundary layer formed by weak marine sediments under a moving slide mass. Undrained loading of such offshore sediment results in a profound drop of basal shear resistance, compared to subaerial shear resistance, enabling long undersea runout. Thus large long-runout submarine landslides are not truly enigmatic (Voight and Elsworth 1992, 1997), but are understandable in terms of conventional geotechnical principles. A corollary is that remoulded undrained strength, and not friction angle, should be used for basal resistance in numerical simulations. This hypothesis is testable via drilling and examining the structure at the soles of undersea debris avalanches for indications of incorporation of sheared marine sediments, by tests of soil properties, and by simulations. Such considerations of emplacement process are an aim of ongoing research in the Lesser Antilles (Caribbean Sea), where multiple offshore debris avalanche and dome-collapse debris deposits have been identified since 1999 on swath bathymetric surveys collected in five oceanographic cruises. This paper reviews the prehistoric and historic collapses that have occurred offshore of Antilles arc islands and summarizes ongoing research on emplacement processes.
Abstract:
Background/Aims: The aim of this study was to investigate the colonization of mutans streptococci (MS) and lactobacilli (LB) in predentate children from the neonatal period to 7 months. Methods: A total of 957 mother-and-child pairs were recruited from birth and followed up at 7 months. The 283 children who did not have erupted teeth at the second visit were included in the study. Oral mucosal swabs were taken, and the presence of MS and LB was determined using a commercial microbiological culture kit. Results: At mean ages of 34 days and 7 months, 9 and 11% of the infants, respectively, showed the presence of MS. In contrast, LB presence increased from 24 to 47% (p < 0.0001). MS presence in the neonatal period was associated with maternal MS counts of >105 CFU/ml (p = 0.05), while LB presence was associated with natural birth (p = 0.03) and maternal LB presence (p = 0.02). At 7 months, MS presence was associated with maternal MS counts (p = 0.02) and LB counts of >105 CFU/ml (p = 0.007). Additional predictors of MS presence at 7 months were a child’s MS counts of >105 CFU/ml at the neonatal visit (p = 0.019) and nighttime bottle feeding (p = 0.024). LB presence at 7 months was associated with maternal LB (p < 0.001) and MS presence (p = 0.02). Conclusions: MS and LB can be detected by culture in the oral cavity as early as 34 days after birth. Their infection rates increase to 11 and 47%, respectively, by the time the children reach the end of the predentate stage of oral development.
Abstract:
Gypsum plasterboards are commonly used as a fire safety material in the building industry. Many research studies have been undertaken to investigate the thermal behaviour of plasterboards under standard fire conditions. However, there are many discrepancies in relation to the basic thermal properties of plasterboards, while simple equations are not available to predict the ambient surface time–temperature profiles of gypsum plasterboard panels that can be used in simulating the behaviour and strength of steel studs or joists in load-bearing LSF wall and floor systems. In this research, suitable thermal properties of plasterboards were proposed based on a series of tests and available results from past research. Finite element models of gypsum plasterboard panels were then developed to simulate their thermal behaviour under standard fire conditions. The accuracy of the proposed thermal properties and the finite element models was validated by comparing the numerical results with available fire test results of plasterboard panels. This paper presents the details of the finite element models of plasterboard panels, the thermal analysis results from finite element analyses under standard fire conditions and their comparisons with experimental results.
Abstract:
Objective: To use our Bayesian method of motor unit number estimation (MUNE) to evaluate lower motor neuron degeneration in ALS. Methods: In subjects with ALS we performed serial MUNE studies. We examined the repeatability of the test and then determined whether the loss of MUs was fitted by an exponential or Weibull distribution. Results: The decline in motor unit (MU) numbers was well fitted by an exponential decay curve. We calculated the half-life of MUs in the abductor digiti minimi (ADM), abductor pollicis brevis (APB) and/or extensor digitorum brevis (EDB) muscles. The mean half-life of the MUs of the ADM muscle was greater than those of the APB or EDB muscles. The half-life of MUs was less in the ADM muscle of subjects with upper limb onset than in those with lower limb onset. Conclusions: The rate of loss of lower motor neurons in ALS is exponential, the motor units of the APB decay more quickly than those of the ADM muscle, and the rate of loss of motor units is greater at the site of onset of disease. Significance: This shows that the Bayesian MUNE method is useful in following the course and exploring the clinical features of ALS. © 2012 International Federation of Clinical Neurophysiology.
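The half-life calculation implied by an exponential decline in motor unit numbers can be sketched as a log-linear least-squares fit (synthetic data; this is not the study's Bayesian MUNE procedure):

```python
import math

def fit_half_life(times, counts):
    """Fit N(t) = N0 * exp(-lam * t) by least squares on log counts and
    return the half-life ln(2) / lam. Illustrates why an exponential
    decline in MU numbers yields a constant half-life."""
    n = len(times)
    logs = [math.log(c) for c in counts]
    mean_t = sum(times) / n
    mean_l = sum(logs) / n
    slope = (sum((t - mean_t) * (l - mean_l) for t, l in zip(times, logs))
             / sum((t - mean_t) ** 2 for t in times))
    return math.log(2) / -slope  # slope is -lam on the log scale

# Synthetic MU counts halving every 6 months:
times = [0, 3, 6, 9, 12]
counts = [100 * 2 ** (-t / 6) for t in times]
print(round(fit_half_life(times, counts), 3))  # → 6.0
```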
Abstract:
Background: Tenofovir has been associated with renal phosphate wasting, reduced bone mineral density, and higher parathyroid hormone levels. The aim of this study was to carry out a detailed comparison of the effects of tenofovir versus non-tenofovir use on calcium, phosphate, vitamin D, parathyroid hormone (PTH), and bone mineral density. Methods: A cohort study of 56 HIV-1 infected adults at a single centre in the UK on stable antiretroviral regimens comparing biochemical and bone mineral density parameters between patients receiving either tenofovir or another nucleoside reverse transcriptase inhibitor. Principal Findings: In the unadjusted analysis, there was no significant difference between the two groups in PTH levels (tenofovir mean 5.9 pmol/L, 95% confidence interval 5.0 to 6.8, versus non-tenofovir 5.9, 4.9 to 6.9; p = 0.98). Patients on tenofovir had significantly reduced urinary calcium excretion (median 3.01 mmol/24 hours) compared to non-tenofovir users (4.56; p < 0.0001). Stratification of the analysis by age and ethnicity revealed that non-white men, but not women, on tenofovir had higher PTH levels than non-white men not on tenofovir (mean difference 3.1 pmol/L, 95% CI 0.9 to 5.3; p = 0.007). Those patients with optimal 25-hydroxyvitamin D (>75 nmol/L) on tenofovir had higher 1,25-dihydroxyvitamin D [1,25(OH)2D] (median 48 pg/mL versus 31; p = 0.012), higher fractional excretion of phosphate (median 26.1% versus 14.6%; p = 0.025) and lower serum phosphate (median 0.79 mmol/L versus 1.02; p = 0.040) than those not taking tenofovir. Conclusions: The effects of tenofovir on PTH levels were modified by sex and ethnicity in this cohort. Vitamin D status also modified the effects of tenofovir on serum concentrations of 1,25(OH)2D and phosphate.
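The fractional excretion of phosphate reported above is conventionally computed as a clearance ratio from paired spot urine and serum values; a minimal sketch of the standard formula (function name and example values are illustrative):

```python
def fractional_excretion_phosphate(urine_p, serum_p, urine_cr, serum_cr):
    """Fractional excretion of phosphate (%) from paired spot urine and
    serum values: 100 * (U_phos * S_creat) / (S_phos * U_creat).
    Each phosphate/creatinine pair must use consistent units."""
    return 100.0 * (urine_p * serum_cr) / (serum_p * urine_cr)

# Illustrative values only (mmol/L throughout):
fe = fractional_excretion_phosphate(12.0, 1.2, 100.0, 1.0)
```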
Abstract:
This paper investigates the effects of lane-changing in driver behavior by measuring (i) the induced transient behavior and (ii) the change in driver characteristics, i.e., changes in driver response time and minimum spacing. We find that the transition largely consists of a pre-insertion transition and a relaxation process. These two processes are different but can be reasonably captured with a single model. The findings also suggest that lane-changing induces a regressive effect on driver characteristics: a timid driver (characterized by larger response time and minimum spacing) tends to become less timid and an aggressive driver less aggressive. We offer an extension to Newell’s car-following model to describe this regressive effect and verify it using vehicle trajectory data.
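Newell's car-following model, which the paper extends, takes the follower's trajectory to be the leader's shifted by a response time tau and a minimum spacing d; a minimal discrete-time sketch under those definitions (parameter values are illustrative):

```python
def newell_follower(leader, dt, tau, d, v_free, x0):
    """Discrete-time Newell car-following model.

    The follower's position is the minimum of its free-flow position
    and the leader's trajectory shifted by response time `tau` (s) and
    minimum spacing `d` (m), the two driver characteristics the paper's
    regressive-effect extension lets vary after a lane change.
    `leader` holds the leader's positions sampled every `dt` seconds."""
    shift = int(round(tau / dt))
    follower = [x0]
    for i in range(1, len(leader)):
        free_flow = follower[-1] + v_free * dt
        congested = leader[max(0, i - shift)] - d
        follower.append(min(free_flow, congested))
    return follower
```

Once the follower catches up, its trajectory is simply the leader's displaced by (tau, d), which is the model's defining property.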
Abstract:
In a globalised world, it makes sense to examine our demands on the landscape through the wide-angle lens of ecological footprint analysis. However, the important impetus towards a more localised societal system suggests a review of this approach and a return to its origins in carrying capacity assessment. The determination of whether we live within or beyond our carrying capacity is entirely scalar, with national, regional and local assessments dependent not only on the choices of the population but on the capability of a landscape, at scale. The Carrying Capacity Dashboard, an openly accessible online modelling interface, has been developed for Australian conditions, facilitating analysis at various scales. Like ecological footprint analysis, it allows users to test a variety of societal behaviours such as diet, consumption patterns, farming systems and ecological protection practices; but unlike the footprint approach, the results are uniquely tailored to place. This paper examines population estimates generated by the Carrying Capacity Dashboard. It compares results at various scales of analysis, from national to local. It examines the key behavioural choices influencing Australian carrying capacity estimates. For instance, the assumption that the consumption of red meat automatically lowers carrying capacity is examined and, in some cases, debunked. Lastly, it examines the implications of implementing carrying capacity assessment globally, but not through a wide-angle lens; rather, by examining the landscape one locality at a time.
Abstract:
INTRODUCTION: Little research has examined recognized pregnancy losses in a general population. Data from an Australian cohort study provide an opportunity to quantify this aspect of fecundity at a population level. METHOD: Participants in the Australian Longitudinal Study on Women's Health who were aged 28-33 years in 2006 (n = 9,145) completed up to 4 mailed surveys over 10 years. Participants were categorized according to the recognized outcome of their pregnancies, including live birth, miscarriage/stillbirth, termination/ectopic, or no pregnancy. RESULTS: At age 18-23, more women reported terminations (7%) than miscarriages (4%). By 28-33 years, the cumulative frequency of miscarriage (15%) was as common as termination (16%). For women aged 28-33 years who had ever been pregnant (n = 5,343), pregnancy outcomes were as follows: birth only (50%); loss only (18%); and birth and loss (32%), of which half (16%) were birth and miscarriage. A comparison between first miscarriage and first birth (no miscarriage) showed that most first miscarriages occurred in women aged 18-23 years who also reported a first birth at the same survey (15%). Half (51%) of all first births and first miscarriages in women aged 18-19 ended in miscarriage. Early childbearers (<28 years) often had miscarriages around the same time period as their first live birth, suggesting proactive family formation. Delayed childbearers (32-33 years) had more first births than first miscarriages. CONCLUSION: Recognized pregnancy losses are an important measure of fecundity in the general population because they indicate successful conception and maintenance of pregnancy to varying reproductive endpoints.
Abstract:
Recently, a stream of project management research has recognized the critical role of boundary objects in the organization of projects. In this paper, we investigate how one advanced scheduling tool, the Integrated Master Schedule (IMS), is used as a temporal boundary object at various stages of complex projects. The IMS is critical to megaprojects which typically span long periods of time and face a high degree of complexity and uncertainty. In this paper, we conceptualize projects of this type as complex adaptive systems (CAS). We report the findings of four case projects on how the IMS mapped interactions, interdependencies, constraints, and fractal patterns of these emerging projects, and how the process of IMS visualization enabled communication and negotiation of project realities. This paper highlights that this advanced timeline tool acts as a boundary object and elicits shared understanding of complex projects from their stakeholders.
Abstract:
This paper provides an overview of the challenges faced by remote, rural and regional airports in Australia. The deregulation of airports over the past decades has resulted in local councils owning most of the rural and regional airports across Australia. The paper provides an overview of the international literature on regional airports and research directed at defining the issues faced by regional and rural airports in Australia. A survey of airport managers, regulators and local councils was undertaken across Australia to outline the challenges and stresses that regional airports are facing. Core findings indicate that the operation of rural and regional airports is under stress due to the interrelating factors of infrastructure costs, high cost of maintenance, and security infrastructure upgrades. Small airports often compete with one another to attract airlines and maintain their infrastructure advantage.
Abstract:
Precise protein quantification and recommendation is essential in clinical dietetics, particularly in the management of individuals with chronic kidney disease, malnutrition, burns, wounds, pressure ulcers, and those in active sports. The Expedited 10g Protein Counter (EP-10) was developed to simplify the quantification of dietary protein for assessment and recommendation of protein intake.1 Instead of using separate protein exchanges for different food groups to quantify the dietary protein intake of an individual, every exchange in the EP-10 accounts for 3g from non-protein-rich food and 7g from protein-rich food (Table 1). The EP-10 was recently validated and published in the Journal of Renal Nutrition.1 That study demonstrated that using the EP-10 for dietary protein intake quantification had clinically acceptable validity and reliability when compared with the conventional 7g protein exchange while requiring less time.2 In clinical practice, the use of efficient, accurate and practical methods to facilitate assessment and treatment plans is important. The EP-10 can be easily implemented in the nutrition assessment and recommendation for a patient in the clinical setting. This patient education tool was adapted from materials printed in the Journal of Renal Nutrition.1 The tool may be used as presented or adapted to assist patients to achieve their recommended daily protein intake.
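The EP-10 arithmetic described above (each exchange = 3 g from non-protein-rich food + 7 g from protein-rich food = 10 g protein) can be sketched as follows (function and key names are illustrative, not from the published tool):

```python
def ep10_exchange_breakdown(n_exchanges):
    """Expedited 10g Protein Counter: each exchange represents 10 g of
    dietary protein, counted as 3 g from non-protein-rich food plus
    7 g from protein-rich food."""
    return {
        "total_g": 10 * n_exchanges,
        "non_protein_rich_g": 3 * n_exchanges,
        "protein_rich_g": 7 * n_exchanges,
    }

# A 60 g/day protein prescription corresponds to 6 EP-10 exchanges:
print(ep10_exchange_breakdown(6)["total_g"])  # → 60
```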