61 results for Miguel Cané (p.)
Abstract:
The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients in Australian and New Zealand hospitals consume ≤50% of the offered food. The ANCDS found a significant association between poor food intake and increased in-hospital mortality after controlling for confounders (nutritional status, age, disease type and severity) [1]. Evidence for the effectiveness of medical nutrition therapy (MNT) in hospital patients eating poorly is lacking. An exploratory study was conducted in respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, 24-hour food intake (0%, 25%, 50%, 75%, 100% of offered meals) was evaluated for patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT, with food intake re-evaluated on day 7. 184 patients were observed over four weeks. Sixty-two patients (34%) consumed ≤50% of the offered meals. Simple interventions (feeding/menu assistance, diet texture modifications) improved intake to ≥75% in 30 patients, who did not require further MNT. Of the 32 patients referred for MNT, baseline and day-7 data were available for 20 patients (68±17 years, 65% female, BMI 22±5 kg/m²; median energy and protein intake 2250 kJ and 25 g respectively). On day 7, 17 participants (85%) demonstrated significantly higher consumption (4300 kJ, 53 g; p<0.01). Three participants demonstrated no improvement due to ongoing nutrition-impact symptoms. “Percentage food intake” was a quick tool to identify patients in whom simple interventions could enhance intake. MNT was associated with improved dietary intake in hospital patients. Further research is needed to establish a causal relationship.
Abstract:
Background and aims The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients consume ≤50% of the offered food in Australian and New Zealand hospitals. After controlling for confounders (nutritional status, age, disease type and severity), the ANCDS also established an independent association between poor food intake and increased in-hospital mortality. This study aimed to evaluate whether medical nutrition therapy (MNT) could improve dietary intake in hospital patients eating poorly. Methods An exploratory pilot study was conducted in the respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, percentage food intake (0%, 25%, 50%, 75%, and 100%) was evaluated for each main meal and snack over a 24-hour period in patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT. Food intake was re-evaluated on the seventh day following recruitment (post-MNT). Results 184 patients were observed over four weeks; 32 patients were referred for MNT. Although baseline and post-MNT data for 20 participants (68±17 years, 65% female) indicated a significant increase in median energy and protein intake post-MNT (3600 kJ/day, 40 g/day) versus baseline (2250 kJ/day, 25 g/day) (p<0.05), the increased intake met only 50% of dietary requirements. Persistent nutrition-impact symptoms affected intake. Conclusion In this pilot study, whilst dietary intake improved, it remained inadequate to meet participants’ estimated requirements due to ongoing nutrition-impact symptoms. Appropriate medical management and early enteral feeding could be a possible solution for such patients.
Abstract:
Many examples of extreme virus resistance and posttranscriptional gene silencing of endogenous or reporter genes have been described in transgenic plants containing sense or antisense transgenes. In these cases of either cosuppression or antisense suppression, there appears to be induction of a surveillance system within the plant that specifically degrades both the transgene and target RNAs. We show that transforming plants with virus or reporter gene constructs that produce RNAs capable of duplex formation confer virus immunity or gene silencing on the plants. This was accomplished by using transcripts from one sense gene and one antisense gene colocated in the plant genome, a single transcript that has self-complementarity, or sense and antisense transcripts from genes brought together by crossing. A model is presented that is consistent with our data and those of other workers, describing the processes of induction and execution of posttranscriptional gene silencing.
Abstract:
The topic of “the cloud” has attracted significant attention throughout the past few years (Cherry 2009; Sterling and Stark 2009) and, as a result, academics and trade journals have created several competing definitions of “cloud computing” (e.g., Motahari-Nezhad et al. 2009). Underpinning this article is the definition put forward by the US National Institute of Standards and Technology, which describes cloud computing as “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction” (Garfinkel 2011, p. 3). Despite the lack of consensus about definitions, however, there is broad agreement on the growing demand for cloud computing. Some estimates suggest that spending on cloud-related technologies and services in the next few years may climb as high as USD 42 billion/year (Buyya et al. 2009).
Abstract:
The research seeks to address the current global water crisis and the built environment's effect on the increasing demand for sustainability and water security. The fundamental question in determining the correct approach to water security in the built environment is whether government regulation and legislation could provide the framework for sustainable development and for the conscious shift that occurs when change is perceived as the only option: there is no alternative. This article attempts to analyse the value of neo-institutional theory as a method for directing individuals and companies to conform to water-saving techniques. As is highlighted throughout the article, it is investigated whether an incentive-versus-punishment approach to government legislation and regulation would provide the framework required to ensure water security within the built environment. Individuals and companies make certain choices or perform certain actions not because they fear punishment or attempt to conform, nor because an action is appropriate or because they feel some sort of social obligation. Instead, the cognitive element of neo-institutionalism suggests that individuals make certain choices because they can conceive of no alternative. The research seeks to identify whether sustainability and water security can become integrated into all aspects of design and architecture through the perception that 'there is no alternative.' This report seeks to address the omission of water security in the built environment by reporting on a series of investigations, interviews, literature reviews, exemplars and statistics relating to the built environment and the potential for increased water security. The results and analysis support the conclusion that, with the support of government and local councils, sustainability in the built environment could be achieved and become common practice for developments.
Also highlighted is the approach required for integrating water management systems into the built environment, and how these systems can be developed and maintained effectively across cities, states, countries and cultures.
Abstract:
The stillbirth of an Australian infant in the mid-20th Century was an event often left unacknowledged. Mothers of stillborn babies were often told to 'forget about it and have another baby.' Siblings of these babies were often not encouraged to discuss them, and were even left unaware of their birth and death. This paper explores this phenomenon in an Australian case study. When Nancy was born in 1937, her twin sister was stillborn. As was customary at that time, the deceased baby was buried unnamed in an unmarked plot without ceremony. Little was said of her thereafter. Seventy-three years later, Nancy finally undertook a number of activities with ritualised features that acknowledged, named, mourned and honoured her sister.
Abstract:
Background Preliminary research shows ginger may be an effective adjuvant treatment for chemotherapy-induced nausea and vomiting, but significant limitations need to be addressed before recommendations for clinical practice can be made. Methods/Design In a double-blind randomised controlled trial, chemotherapy-naïve patients will be randomly allocated to receive either 1.2 g of a standardised ginger extract or placebo per day. The study medication will be administered as an adjuvant treatment to standard anti-emetic therapy and will be divided into four capsules per day, to be consumed approximately every 4 hours (300 mg per capsule administered q.i.d.) for five days during the first three cycles of chemotherapy. Acute, delayed, and anticipatory symptoms of nausea and vomiting will be assessed over this time frame using a valid and reliable questionnaire, with nausea symptoms being the primary outcome. Quality of life, nutritional status, adverse effects, patient adherence, cancer-related fatigue, and CINV-specific prognostic factors will also be assessed. Discussion Previous trials in this area have noted limitations. These include the inconsistent use of standardised ginger formulations and valid questionnaires, lack of control for anticipatory nausea and for prognostic factors that may influence individual CINV response, and the use of suboptimal dosing regimens. This trial is the first to address these issues by incorporating multiple unique additions to the study design, including controlling for CINV-specific prognostic factors by recruiting only chemotherapy-naïve patients, implementing a dosing schedule consistent with the pharmacokinetics of oral ginger supplements, and independently analysing ginger supplements before and after recruitment to ensure potency. Our trial will also be the first to assess the effect of ginger supplementation on cancer-related fatigue and nutritional status.
Chemotherapy-induced nausea and vomiting are distressing symptoms experienced by oncology patients; this trial will address the significant limitations within the current literature and, in doing so, will investigate the effect of ginger supplementation as an adjuvant treatment in modulating nausea and vomiting symptoms. Trial registration
Abstract:
Moving cell fronts are an essential feature of wound healing, development and disease. The rate at which a cell front moves is driven, in part, by the cell motility, quantified in terms of the cell diffusivity $D$, and the cell proliferation rate $\lambda$. Scratch assays are a commonly-reported procedure used to investigate the motion of cell fronts: an initial cell monolayer is scratched and the motion of the front is monitored over a short period of time, often less than 24 hours. The simplest way of quantifying a scratch assay is to monitor the progression of the leading edge. Leading edge data are very convenient since, unlike other methods, collecting them is nondestructive and does not require labeling, tracking or counting individual cells amongst the population. In this work we study short-time leading edge data in a scratch assay using a discrete mathematical model and automated image analysis, with the aim of investigating whether such data allow us to reliably identify $D$ and $\lambda$. Using a naïve calibration approach in which we simply scan the relevant region of the ($D$, $\lambda$) parameter space, we show that there are many choices of $D$ and $\lambda$ for which our model produces indistinguishable short-time leading edge data. Therefore, without due care, it is impossible to estimate $D$ and $\lambda$ from this kind of data. To address this, we present a modified approach accounting for the fact that cell motility occurs over a much shorter time scale than proliferation. Using this information we divide the duration of the experiment into two periods: we estimate $D$ using data from the first period, and $\lambda$ using data from the second period. We confirm the accuracy of our approach using in silico data and a new set of in vitro data, which shows that our method recovers estimates of $D$ and $\lambda$ that are consistent with previously-reported values; moreover, our approach is fast, inexpensive, nondestructive and avoids the need for cell labeling and cell counting.
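The identifiability problem described above has a simple continuum illustration: in the Fisher-Kolmogorov limit of such discrete models, the long-time front speed is $c = 2\sqrt{D\lambda}$, so every ($D$, $\lambda$) pair lying on the same hyperbola $D\lambda = \text{const.}$ yields the same leading-edge speed. A minimal sketch of this degeneracy (the values and units are illustrative assumptions, not the paper's data):

```python
import math

def front_speed(D, lam):
    """Long-time Fisher-Kolmogorov front speed, c = 2*sqrt(D * lambda)."""
    return 2.0 * math.sqrt(D * lam)

# Three (D, lambda) pairs on the hyperbola D * lambda = 4:
# leading-edge speed alone cannot separate motility from proliferation.
pairs = [(100.0, 0.04), (200.0, 0.02), (400.0, 0.01)]  # D in um^2/h, lambda in 1/h
speeds = [front_speed(D, lam) for D, lam in pairs]
print(speeds)  # all three pairs give the same front speed (4.0 um/h)
```

This is why the paper's two-period approach is needed: early-time data, dominated by motility, constrains $D$ on its own, and later data then constrains $\lambda$.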
Abstract:
Thinking of cutting physical education? Think again. Even as we bemoan children's sedentary lifestyles, we often sacrifice school-based physical education in the name of providing more time for academics. In 2006, only 3.8 percent of elementary schools, 7.9 percent of middle schools, and 2.1 percent of high schools offered students daily physical education or its equivalent for the entire school year (Lee, Burgeson, Fulton, & Spain, 2007). We believe this marked reduction in school-based physical activity risks students' health and can't be justified on educational or ethical grounds. We'll get to the educational grounds in a moment. As to ethical reasons for keeping physical activity part of our young people's school days, consider the fact that childhood obesity is now one of the most serious health issues facing U.S. children (Ogden et al., 2006). School-based physical education programs engage students in regular physical activity and help them acquire skills and habits necessary to pursue an active lifestyle. Such programs are directly relevant to preventing obesity. Yet they are increasingly on the chopping block.
Abstract:
One cannot help but be impressed by the inroads that digital oilfield technologies have made into the exploration and production (E&P) industry in the past decade. Today’s production systems can be monitored by “smart” sensors that allow engineers to observe almost any aspect of performance in real time. Our understanding of how reservoirs are behaving has improved considerably since the dawn of this revolution, and the industry has been able to move away from point answers to more holistic “big picture” integrated solutions. Indeed, the industry has already reaped the rewards of many of these kinds of investments. Many billions of dollars of value have been delivered by this heightened awareness of what is going on within our assets and the world around them (Van Den Berg et al. 2010).
Abstract:
There is a wide variety of drivers for business process modelling initiatives, ranging from business evolution and process optimisation, through compliance checking and process certification, to process enactment. That, in turn, results in models that differ in content because they serve different purposes. In particular, processes are modelled at different abstraction levels and from different perspectives. Vertical alignment of process models aims at handling these deviations. While the advantages of such an alignment for inter-model analysis and change propagation are beyond question, a number of challenges still have to be addressed. In this paper, we discuss three main challenges for vertical alignment in detail. Against this background, the potential application of techniques from the field of process integration is critically assessed. Based thereon, we identify specific research questions that guide the design of a framework for model alignment.
Abstract:
Nanoparticles and low-temperature plasmas have been developed, independently and often along different routes, to tackle the same set of challenges in biomedicine. There are intriguing similarities and contrasts in their interactions with cells and living tissues, and these are reflected directly in the characteristics and scope of their intended therapeutic solutions, in particular their chemical reactivity, selectivity against pathogens and cancer cells, safety to healthy cells and tissues and targeted delivery to diseased tissues. The time has come to ask the inevitable question of possible plasma–nanoparticle synergy and the related benefits to the development of effective, selective and safe therapies for modern medicine. This perspective paper offers a detailed review of the strengths and weaknesses of nanomedicine and plasma medicine, each as a stand-alone technology, and then provides a critical analysis of some of the major opportunities enabled by synergizing nanotechnology and plasma technology. It is shown that the plasma–nanoparticle synergy is best captured through plasma nanotechnology and that its benefits for medicine are highly promising.
Abstract:
Introduction The risk of late periprosthetic fractures is higher in patients treated for a neck of femur fracture than in those treated for osteoarthritis. It has been hypothesised that osteopenia and the consequent decreased stiffness of the proximal femur are responsible for this. We investigated whether a femoral component with a bigger body would increase the torque to failure in a biaxially loaded composite sawbone model. Method A biomechanical composite sawbone model was used. Two different body sizes (Exeter 44-1 vs 44-4) of a polished tapered cemented stem were implanted by an experienced surgeon in 7 sawbones each and loaded at 40 deg/s internal rotation until failure. Torque to fracture and fracture energy were measured using a biaxial materials testing device (Instron 8874). Data are non-parametric and were tested with the Mann-Whitney U-test. Results The mean torque load to fracture was 154.1 Nm (SD 4.4) for the 44-1 stem and 229 Nm (SD 10.9) for the 44-4 stem (p = 0.01). The mean fracture energy was 9.6 J (SD 1.2) for the 44-1 stem and 17.2 J (SD 2.0) for the 44-4 stem (p = 0.14). Conclusion The use of a large-body polished tapered cemented stem for neck of femur fractures increases the torque to failure in a biomechanical model and is therefore likely to reduce late periprosthetic fracture risk in this vulnerable cohort.
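The between-group comparison above relies on the Mann-Whitney U-test. A minimal sketch of the statistic itself (the torque values are illustrative assumptions chosen to resemble the reported group means, not the study's raw measurements):

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for two independent samples.

    Counts, over all (xi, yj) pairs, how often a value in x exceeds a
    value in y; ties count as 0.5. No p-value or tie correction here.
    """
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

# Torque to fracture in Nm for 7 sawbones per stem (hypothetical values).
stem_44_1 = [148.9, 151.7, 153.2, 154.5, 155.8, 157.0, 158.6]
stem_44_4 = [215.3, 221.0, 226.4, 229.1, 232.8, 238.5, 240.2]

u = mann_whitney_u(stem_44_4, stem_44_1)
print(u)  # 49.0: every 44-4 torque exceeds every 44-1 torque
```

For completely separated samples, as sketched here, U reaches its maximum of n₁ × n₂ = 49, which yields the smallest p-value attainable with two groups of seven.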
Abstract:
The majority of Escherichia coli strains isolated from urinary tract infections have the potential to express multiple fimbriae. Two of the most common fimbrial adhesins are type 1 fimbriae and pyelonephritis-associated pili (Pap). Previous research has shown that induced, plasmid-based expression of a Pap regulator, papB, and its close homologues can prevent inversion of the fim switch controlling the expression of type 1 fimbriae. The aim of the present study was to determine if this cross-regulation occurs when PapB is expressed from its native promoter in the chromosome of E. coli K-12 and clinical isolates. The regulation was examined in three ways: (1) mutated alleles of the pap regulatory region, including papB and papI, that maintain the pap promoter in either the off or the on phase were exchanged into the chromosome of both E. coli K-12 and the clinical isolate E. coli CFT073, and the effect on type 1 fimbrial expression was measured; (2) type 1 fimbrial expression was determined using a novel fimS::gfp+ reporter system in mutants of the clinical isolate E. coli 536 in which combinations of complete fimbrial clusters had been deleted; (3) type 1 fimbrial expression was determined in a range of clinical isolates and compared with both the number of P clusters and their expression. All three approaches demonstrated that P expression represses type 1 fimbrial expression. Using a number of novel genetic approaches, this work extends the initial finding that PapB inhibits FimB recombination to the impact of this regulation in clinical isolates.
Abstract:
Objectives Commercial sex is licensed in Victoria, Australia, such that sex workers are required to have regular tests for sexually transmitted infections (STIs). However, the incidence and prevalence of STIs in sex workers are very low, especially since there is almost universal condom use at work. We aimed to conduct a cost-effectiveness analysis of the financial cost of the testing policy versus the health benefits of averting the transmission of HIV, syphilis, chlamydia and gonorrhoea to clients. Methods We developed a simple mathematical transmission model, informed by conservative parameter estimates from all available data, linked to a cost-effectiveness analysis. Results We estimated that under current testing rates, it costs over $A90 000 in screening costs for every chlamydia infection averted (and $A600 000 in screening costs for each quality-adjusted life year (QALY) saved) and over $A4 000 000 for every HIV infection averted ($A10 000 000 in screening costs for each QALY saved). At an assumed willingness to pay of $A50 000 per QALY gained, HIV testing should be conducted no more frequently than approximately every 40 weeks, and chlamydia testing no more than approximately once per year; in comparison, current requirements are testing every 12 weeks for HIV and every 4 weeks for chlamydia. Conclusions Mandatory screening of female sex workers at current testing frequencies is not cost-effective for the prevention of disease in their male clients. The current testing rate required of sex workers in Victoria is excessive. Screening intervals for sex workers should be based on local STI epidemiology and not fixed by legislation.
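The headline ratios above are, at their core, annual screening spend divided by the number of infections averted per year. A minimal sketch of that arithmetic with hypothetical inputs (cost per test, testing frequency, workforce size and infections averted are all assumptions for illustration; the study itself links a transmission model to the cost analysis):

```python
def cost_per_infection_averted(cost_per_test, tests_per_worker_per_year,
                               n_workers, infections_averted_per_year):
    """Annual screening spend divided by client infections averted per year."""
    annual_cost = cost_per_test * tests_per_worker_per_year * n_workers
    return annual_cost / infections_averted_per_year

# Hypothetical scenario: chlamydia testing every 4 weeks (13 tests/year)
# at $A30 per test across 1000 workers, averting 4 client infections/year.
ratio = cost_per_infection_averted(30.0, 13, 1000, 4)
print(f"$A{ratio:,.0f} per infection averted")  # $A97,500 per infection averted
```

Halving the testing frequency in this sketch halves the numerator while, under the study's low-incidence assumptions, barely changing the denominator, which is the intuition behind lengthening the mandated screening intervals.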