Abstract:
Long-term loss of soil C stocks under conventional tillage and accrual of soil C following adoption of no-tillage have been well documented. No-tillage use is spreading, but it is common to till occasionally within a no-till regime, or to alternate regularly between till and no-till practices within a rotation of different crops. Short-term studies indicate that substantial amounts of C can be lost from the soil immediately following a tillage event, but few field studies have investigated the impact of infrequent tillage on soil C stocks. How much of the C sequestered under no-tillage is likely to be lost if the soil is tilled? What are the longer-term impacts of continued infrequent tillage? If producers are to be compensated for sequestering C in soil following adoption of conservation tillage practices, the impacts of infrequent tillage need to be quantified. A few studies have examined the short-term impacts of tillage on soil C, and several have investigated the impacts of adoption of continuous no-tillage. We present: (1) results from a modeling study carried out to address these questions more broadly than the published literature allows, (2) a review of the literature examining the short-term impacts of tillage on soil C, (3) a review of published studies on the physical impacts of tillage, and (4) a synthesis of these components to assess how infrequent tillage impacts soil C stocks and how changes in tillage frequency could impact soil C stocks and C sequestration. Results indicate that soil C declines significantly following even one tillage event (1-11% of soil C lost). Longer-term losses increase as frequency of tillage increases. Model analyses indicate that cultivating and ripping are less disruptive than moldboard plowing; soil C for those treatments averages just 6% less than under continuous no-tillage (NT), compared to 27% less for conventional tillage (CT). Most (80%) of the soil C gains of NT can be realized with NT coupled with biannual cultivating or ripping.
© 2007 Elsevier B.V. All rights reserved.
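As a back-of-the-envelope check on the percentages reported in the abstract above (treatments averaging 6% below continuous NT versus 27% below for CT, with ~80% of NT gains retained under NT plus periodic cultivating or ripping), the retained fraction can be computed directly. The baseline soil C value is an arbitrary placeholder, not a figure from the study:

```python
# Illustrative arithmetic using only the percentages reported in the abstract.
# The baseline value (continuous no-till soil C stock) is a placeholder.
nt_soil_c = 100.0  # arbitrary units: continuous no-till (NT) stock

ct_soil_c = nt_soil_c * (1 - 0.27)         # conventional tillage: 27% less than NT
cultivate_soil_c = nt_soil_c * (1 - 0.06)  # cultivating/ripping: 6% less than NT

# Fraction of the NT-over-CT gain retained under NT with periodic cultivation:
gain_retained = (cultivate_soil_c - ct_soil_c) / (nt_soil_c - ct_soil_c)
print(round(gain_retained, 2))  # -> 0.78, consistent with the ~80% stated
```

The 80% figure in the abstract is thus roughly recoverable from the 6% and 27% treatment differences alone.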
Abstract:
Norman K. Denzin (1989) claims that the central assumption of the biographical method—that a life can be captured and represented in a text—is open to question. This paper explores Denzin’s statement by documenting the role of creative writers in re-presenting oral histories in two case studies from Queensland, Australia. The first, The Queensland Business Leaders Hall of Fame, was a commercial research project commissioned by the State Library of Queensland (SLQ) in 2009, and involved semi-formal qualitative interviews and digital stories. The second is an ongoing practice-led PhD project, The Artful Life: Oral History and Fiction, which investigates the fictionalisation of oral histories. Both projects enter into a dialogue around the re-presentation of oral and life histories, with attention given to the critical scholarship and creative practice in the process. Creative writers re-present a life with particular preoccupations and techniques that align more closely with fiction than with non-fiction (Hirsch and Dixon 2008). In this context, oral history resources are viewed not so much as repositories of historical facts, but as ambiguous and fluid narrative sources. The comparison of the two case studies also demonstrates that the aims of a particular project dictate the nature of the re-presentation, revealing that writing about another’s life is a complex act of artful ‘shaping’. Alistair Thomson (2007) notes the growing interdisciplinary nature of oral history scholarship since the 1980s; oral histories are used increasingly in art-based contexts to produce diverse cultural artefacts, such as digital stories and works of fiction, which are very different from traditional histories. What are the methodological implications of such projects? This paper will draw on self-reflexive practice to explore this question.
Abstract:
OBJECTIVE: To examine whether some drivers with hemianopia or quadrantanopia display safe driving skills on the road compared with drivers with normal visual fields. METHOD: An occupational therapist evaluated 22 people with hemianopia, 8 with quadrantanopia, and 30 with normal vision for driving skills during naturalistic driving using six rating scales. RESULTS: Of drivers with normal vision, >90% drove flawlessly or had minor errors. Although drivers with hemianopia were more likely to receive poorer ratings for all skills, 59.1%-81.8% performed with no or minor errors. A skill commonly problematic for them was lane keeping (40.9%). Of 8 drivers with quadrantanopia, 7 (87.5%) exhibited no or minor errors. CONCLUSION: This study of people with hemianopia or quadrantanopia with no lateral spatial neglect highlights the need to provide individual opportunities for on-road driving evaluation under natural traffic conditions if a person is motivated to return to driving after brain injury.
Abstract:
This interview with Paul Makeham was conducted in 2010 by Felipe Carneiro from Brazilian business magazine Exame. Structured around Exame's "seven questions" format ("Sete Perguntas"), the interview ranges across topics relating to the creative economy, including the increasingly important role of creativity in business, and the role of education in promoting creativity.
Abstract:
In a clinical setting, pain is reported either through patient self-report or via an observer. Such measures are problematic as they are: 1) subjective, and 2) give no specific timing information. Coding pain as a series of facial action units (AUs) can avoid these issues as it can be used to gain an objective measure of pain on a frame-by-frame basis. Using video data from patients with shoulder injuries, we describe an active appearance model (AAM)-based system that can automatically detect the frames in video in which a patient is in pain. This pain data set highlights the many challenges associated with spontaneous emotion detection, particularly that of expression and head movement due to the patient's reaction to pain. In this paper, we show that the AAM can deal with these movements and can achieve significant improvements in both AU and pain detection performance compared to the current state-of-the-art approaches, which utilize similarity-normalized appearance features only.
Abstract:
In 2006, the Faculty of Built Environment and Engineering introduced the first faculty-wide unit dedicated to sustainability at any Australian university. BEB200 Introducing Sustainability has semester enrolments of up to 1500 students. Instruments such as lectures, readings, field visits, group projects and structured tutorial activities are used, and these have evolved over the last five years in response to student and staff feedback and attempts to better engage students. More than seventy staff have taught in the unit, which is in its final offering in this form in 2010. This paper reflects on the experiences of five academics who have played key roles in the development and teaching of this unit over the last five years. They argue that sustainability is not an end in itself, but a paradigm that allows students to explore other ways of knowing as they engage with issues in a complex world. From the students’ perspective, grappling with such issues enables them to move towards a context in which they can understand their own discipline and its role in the contradictory and rapidly changing professional world. Insights are offered into how sustainability units may be developed in the future.
Abstract:
Background: People with cardiac disease and type 2 diabetes have higher hospital readmission rates (22%) compared to those without diabetes (6%). Self-management is an effective approach to achieve better health outcomes; however, there is a lack of specifically designed programs for patients with these dual conditions. This project aims to extend the development and pilot testing of a Cardiac-Diabetes Self-Management Program incorporating user-friendly technologies and the preparation of lay personnel to provide follow-up support. Methods/Design: A randomised controlled trial will be used to explore the feasibility and acceptability of the Cardiac-Diabetes Self-Management Program incorporating DVD case studies and trained peers to provide follow-up support by telephone and text-messaging. A total of 30 cardiac patients with type 2 diabetes will be randomised, either to the usual care group or to the intervention group. Participants in the intervention group will receive the Cardiac-Diabetes Self-Management Program in addition to their usual care. The intervention consists of three face-to-face sessions as well as telephone and text-messaging follow-up. The face-to-face sessions will be provided by a trained Research Nurse, commencing in the Coronary Care Unit and continuing after discharge with trained peers. Peers will follow up patients for up to one month after discharge using text messages and telephone support. Data collection will be conducted at baseline (Time 1) and at one month (Time 2). The primary outcomes include self-efficacy, self-care behaviour and knowledge, measured by well-established, reliable tools. Discussion: This paper presents the study protocol of a randomised controlled trial to pilot test a Cardiac-Diabetes Self-Management Program, and the feasibility of incorporating peers in the follow-up.
Results of this study will provide directions for using such a mode of delivering a self-management program for patients with both a cardiac condition and diabetes. Furthermore, it will provide valuable information for refinement of the intervention program.
Abstract:
In a report in the New York Times about a public symposium on the future of theory held at the University of Chicago in 2002, staff writer Emily Eakin suggests that theory appears to have taken a back seat to more pressing current affairs – the Bush Administration, Al Qaeda, Iraq. Further, she reports that the symposium’s panel of high-profile theorists and scholars, including Homi Bhabha, Stanley Fish, and Fredric Jameson, seemed reticent to offer their views on what is often touted as the demise or irrelevance of theory. The symposium and other commentaries on the topic of theory have prompted the view that the ‘Golden Age of Theory’ has passed and we are now in a ‘Post-Theory Age’. Given these pronouncements, we need to ask – Does theory matter any longer? Is it time for the obituary? Or are reports of the death of theory greatly exaggerated? The question remains whether to mourn or celebrate the demise of theory, and whether the body has in fact breathed its last. The title of this Introduction – ‘Bringing back theory’ – suggests a resurrection, or perhaps a haunting, as if the funeral has passed and, like Banquo’s ghost, theory returns to unsettle or disturb the celebration. It also suggests an entreaty, or perhaps a return performance. Rather than settle on one meaning, one interpretation, we are happy for all possibilities to coexist. The coexistence of different theories, different approaches, different interpretations also reflects the state of literary and cultural studies generally and children’s literature criticism in particular. No single theory or viewpoint predominates or vies for hegemony. Yet, one further question lingers – what is theory?
Abstract:
INTRODUCTION. Following anterior thoracoscopic instrumentation and fusion for the treatment of thoracic AIS, implant-related complications have been reported as high as 20.8%. Currently, the magnitudes of the forces applied to the spine during anterior scoliosis surgery are unknown. The aim of this study was to measure the segmental compressive forces applied during anterior single rod instrumentation in a series of adolescent idiopathic scoliosis patients. METHODS. A force transducer was designed, constructed and retrofitted to a surgical cable compression tool routinely used to apply segmental compression during anterior scoliosis correction. Transducer output was continuously logged during the compression of each spinal joint, and the output at completion was converted to an applied compression force using calibration data. The angle between adjacent vertebral body screws was also measured on intra-operative frontal plane fluoroscope images taken both before and after each joint compression. The difference in angle between the two images was calculated as an estimate of the achieved correction at each spinal joint. RESULTS. Force measurements were obtained for 15 scoliosis patients (aged 11-19 years) with single thoracic curves (Cobb angles 47˚-67˚). In total, 95 spinal joints were instrumented. The average force applied for a single joint was 540 N (± 229 N), ranging between 88 N and 1018 N. Experimental error in the force measurement, determined from transducer calibration, was ± 43 N. A trend for higher forces applied at joints close to the apex of the scoliosis was observed. The average joint correction angle measured by fluoroscope imaging was 4.8˚ (± 2.6˚, range 0˚-12.6˚). CONCLUSION. This study has quantified, in vivo, the intra-operative correction forces applied by the surgeon during anterior single rod instrumentation. These data provide a useful contribution towards an improved understanding of the biomechanics of scoliosis correction.
In particular, these data will be used as input for developing patient-specific finite element simulations of scoliosis correction surgery.
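The abstract says only that transducer output was converted to force "using calibration data", and that per-joint correction was estimated as the change in inter-screw angle. The sketch below assumes a simple linear calibration with invented constants; all numeric values are illustrative, chosen to land on the reported mean force (540 N) and mean correction (4.8˚):

```python
# Hypothetical sketch of the two measurements described in the abstract.
# The linear calibration (gain, offset) is invented for illustration only.
def force_from_output(output_mv, gain=2.0, offset=10.0):
    """Linear calibration: force (N) = gain * output (mV) - offset."""
    return gain * output_mv - offset

def joint_correction(angle_before_deg, angle_after_deg):
    """Estimated correction at a joint = change in inter-screw angle."""
    return angle_before_deg - angle_after_deg

print(force_from_output(275.0))                   # -> 540.0 (the reported mean force, N)
print(round(joint_correction(12.4, 7.6), 1))      # -> 4.8 (the reported mean correction, deg)
```

The actual calibration function used in the study is not given in the abstract; only its role (mapping logged output to applied force) is reproduced here.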
Abstract:
Background: The vast sequence divergence among different virus groups has presented a great challenge to alignment-based analysis of virus phylogeny. Due to the problems caused by the uncertainty in alignment, existing tools for phylogenetic analysis based on multiple alignment could not be directly applied to the whole-genome comparison and phylogenomic studies of viruses. There has been a growing interest in alignment-free methods for phylogenetic analysis using complete genome data. Among the alignment-free methods, a dynamical language (DL) method proposed by our group has successfully been applied to the phylogenetic analysis of bacteria and chloroplast genomes. Results: In this paper, the DL method is used to analyze the whole-proteome phylogeny of 124 large dsDNA viruses and 30 parvoviruses, two data sets with a large difference in genome size. The trees from our analyses are in good agreement with the latest classification of large dsDNA viruses and parvoviruses by the International Committee on Taxonomy of Viruses (ICTV). Conclusions: The present method provides a new way for recovering the phylogeny of large dsDNA viruses and parvoviruses, as well as some insights into the affiliation of a number of unclassified viruses. In comparison, some alignment-free methods such as the CV Tree method can be used for recovering the phylogeny of large dsDNA viruses, but they are not suitable for resolving the phylogeny of parvoviruses with a much smaller genome size.
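The DL method itself is not detailed in this abstract. As a generic illustration of the alignment-free idea it builds on, the sketch below computes a composition distance between two sequences from their k-mer frequency vectors (a cosine-style distance). This is a stand-in for the family of methods, not the DL method:

```python
# Generic alignment-free distance sketch: compare two sequences by their
# k-mer count vectors instead of by a multiple alignment.
from collections import Counter
from math import sqrt

def kmer_counts(seq, k=3):
    """Count all overlapping k-mers in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def composition_distance(a, b, k=3):
    """1 minus the cosine similarity of the two k-mer count vectors."""
    ca, cb = kmer_counts(a, k), kmer_counts(b, k)
    keys = set(ca) | set(cb)
    dot = sum(ca[x] * cb[x] for x in keys)  # Counter returns 0 for missing keys
    norm = sqrt(sum(v * v for v in ca.values())) * sqrt(sum(v * v for v in cb.values()))
    return 1 - dot / norm

print(composition_distance("ATGCATGC", "ATGCATGC") < 1e-9)  # identical -> True
```

A distance matrix built this way over whole genomes or proteomes can then be fed to a standard tree-building method (e.g. neighbor-joining), which is the general workflow alignment-free phylogenetics follows.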
Abstract:
Introduction: The ability to screen blood of early stage operable breast cancer patients for circulating tumour cells is of potential importance for identifying patients at risk of developing distant relapse. We present the results of a study of the efficacy of the immunobead RT-PCR method in identifying patients with circulating tumour cells. Results: Immunomagnetic enrichment of circulating tumour cells followed by RT-PCR (immunobead RT-PCR) with a panel of five epithelial specific markers (ELF3, EPHB4, EGFR, MGB1 and TACSTD1) was used to screen for circulating tumour cells in the peripheral blood of 56 breast cancer patients. Twenty patients were positive for two or more RT-PCR markers, including seven patients who were node negative by conventional techniques. Significant increases in the frequency of marker positivity were seen in lymph node positive patients, in patients with high grade tumours and in patients with lymphovascular invasion. A strong trend towards improved disease free survival was seen for marker negative patients, although it did not reach significance (p = 0.08). Conclusion: Multi-marker immunobead RT-PCR analysis of peripheral blood is a robust assay that is capable of detecting circulating tumour cells in early stage breast cancer patients.
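The positivity rule described above (a patient counts as positive when two or more of the five markers are detected) reduces to a simple threshold count. The marker results in the usage lines are invented for illustration:

```python
# Sketch of the two-or-more-markers positivity rule from the abstract.
MARKERS = ["ELF3", "EPHB4", "EGFR", "MGB1", "TACSTD1"]

def is_marker_positive(results, threshold=2):
    """results: dict mapping marker name -> bool (detected by RT-PCR)."""
    return sum(results.get(m, False) for m in MARKERS) >= threshold

# Hypothetical patients:
print(is_marker_positive({"ELF3": True, "MGB1": True}))  # -> True (2 markers)
print(is_marker_positive({"EGFR": True}))                # -> False (1 marker)
```

Requiring two or more markers rather than one is a common way to trade a little sensitivity for specificity in multi-marker assays.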
Abstract:
Agricultural soils emit about 50% of the global flux of N2O attributable to human influence, mostly in response to nitrogen fertilizer use. Recent evidence that the relationship between N2O fluxes and N-fertilizer additions to cereal maize is non-linear provides an opportunity to estimate regional N2O fluxes based on estimates of N application rates, rather than as a simple percentage of N inputs as used by the Intergovernmental Panel on Climate Change (IPCC). We combined a simple empirical model of N2O production with the SOCRATES soil carbon dynamics model to estimate N2O and other sources of Global Warming Potential (GWP) from cereal maize across 19,000 cropland polygons in the North Central Region (NCR) of the US over the period 1964-2005. Results indicate that the loading of greenhouse gases to the atmosphere from cereal maize production in the NCR was 1.7 Gt CO2e, with an average 268 t CO2e produced per tonne of grain. From 1970 until 2005, GHG emissions per unit product declined on average by 2.8 t CO2e ha⁻¹ annum⁻¹, coinciding with a stabilisation in N application rate and consistent increases in grain yield from the mid-1970s. Nitrous oxide production from N fertilizer inputs represented 59% of these emissions, soil C decline (0-30 cm) represented 11% of total emissions, with the remaining 30% (517 Mt) from the combustion of fuel associated with farm operations. Of the 126 Mt of N fertilizer applied to cereal maize from 1964 to 2005, we estimate that 2.2 Mt N was emitted as N2O when using a non-linear response model, equivalent to 1.75% of the applied N.
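The closing figure of the abstract is a straightforward ratio, and it checks out: 2.2 Mt N emitted as N2O out of 126 Mt N applied is indeed the stated 1.75% of applied N:

```python
# Verifying the final arithmetic in the abstract.
n_applied_mt = 126.0  # Mt of N fertilizer applied to cereal maize, 1964-2005
n_emitted_mt = 2.2    # Mt of N estimated emitted as N2O (non-linear model)

fraction_emitted_pct = n_emitted_mt / n_applied_mt * 100
print(round(fraction_emitted_pct, 2))  # -> 1.75
```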
Abstract:
This study investigated the Kinaesthetic Fusion Effect (KFE) first described by Craske and Kenny in 1981. The current study did not replicate these findings. Participants did not perceive any reduction in the sagittal separation of a button pressed by the index finger of one arm and a probe touching the other, following repeated exposure to the tactile stimuli present on both unseen arms. This study’s failure to replicate the widely cited KFE as described by Craske et al. (1984) suggests that it may be contingent on several aspects of visual information, especially the availability of a specific visual reference, the role of instructions regarding gaze direction, and the potential use of a line-of-sight strategy when referring felt positions to an interposed surface. In addition, a foreshortening effect was found; this may result from a line-of-sight judgment and represent a feature of the reporting method used. The transformed line-of-sight data were regressed against the participant-reported values, resulting in a slope of 1.14 (right arm) and 1.11 (left arm), with r > 0.997 for each. The study also provides additional evidence that mis-perception of the mediolateral positions of the limbs, specifically their separation, consistent with notions of Gestalt grouping, is somewhat labile and can be influenced by active motions causing touch of one limb by the other. Finally, this research will benefit future studies that require participants to report the perceived locations of the unseen limbs.
Abstract:
Older adults, especially those acutely ill, are vulnerable to developing malnutrition due to a range of risk factors. The high prevalence and extensive consequences of malnutrition in hospitalised older adults have been reported extensively. However, there are few well-designed longitudinal studies that report the independent relationship between malnutrition and clinical outcomes after adjustment for a wide range of covariates. Acutely ill older adults are exceptionally prone to nutritional decline during hospitalisation, but few reports have studied this change and its impact on clinical outcomes. In the rapidly ageing Singapore population, all this evidence is lacking, and the characteristics associated with the risk of malnutrition are also not well documented. Despite the evidence on malnutrition prevalence, it is often under-recognised and under-treated. It is therefore crucial that validated nutrition screening and assessment tools are used for early identification of malnutrition. Although many nutrition screening and assessment tools are available, there is no universally accepted method for defining malnutrition risk and nutritional status. Most existing tools have been validated amongst Caucasians using various approaches, but they are rarely reported in the Asian elderly and none has been validated in Singapore. Due to the multi-ethnicity and the cultural and language differences among Singapore older adults, the results from non-Asian validation studies may not be applicable. It is therefore important to identify validated, population- and setting-specific nutrition screening and assessment methods to accurately detect and diagnose malnutrition in Singapore.
The aims of this study are therefore to: i) characterise hospitalised elderly in a Singapore acute hospital; ii) describe the extent and impact of admission malnutrition; iii) identify and evaluate suitable methods for nutritional screening and assessment; and iv) examine changes in nutritional status during admission and their impact on clinical outcomes. A total of 281 participants, with a mean (±SD) age of 81.3 (±7.6) years, were recruited from three geriatric wards in Tan Tock Seng Hospital over a period of eight months. They were predominantly Chinese (83%) and community-dwellers (97%). They were screened within 72 hours of admission by a single dietetic technician using four nutrition screening tools [Tan Tock Seng Hospital Nutrition Screening Tool (TTSH NST), Nutritional Risk Screening 2002 (NRS 2002), Mini Nutritional Assessment-Short Form (MNA-SF), and Short Nutritional Assessment Questionnaire (SNAQ©)] that were administered in no particular order. The total scores were not computed during the screening process so that the dietetic technician was blinded to the results of all the tools. Nutritional status was assessed by a single dietitian, who was blinded to the screening results, using four malnutrition assessment methods [Subjective Global Assessment (SGA), Mini Nutritional Assessment (MNA), body mass index (BMI), and corrected arm muscle area (CAMA)]. The SGA rating was completed prior to computation of the total MNA score to minimise bias. Participants were reassessed for weight, arm anthropometry (mid-arm circumference, triceps skinfold thickness), and SGA rating at discharge from the ward. The nutritional assessment tools and indices were validated against clinical outcomes (length of stay (LOS) >11 days, discharge to higher level care, 3-month readmission, 6-month mortality, and 6-month Modified Barthel Index) using multivariate logistic regression.
The covariates included age, gender, race, dementia (defined using DSM-IV criteria), depression (defined using a single question, “Do you often feel sad or depressed?”), severity of illness (defined using a modified version of the Severity of Illness Index), comorbidities (defined using the Charlson Comorbidity Index), number of prescribed drugs, and admission functional status (measured using the Modified Barthel Index; MBI). The nutrition screening tools were validated against the SGA, which was found to be the most appropriate nutritional assessment tool in this study (refer to Section 5.6). Prevalence of malnutrition on admission was 35% (defined by SGA), and it was significantly associated with characteristics such as swallowing impairment (malnourished vs well-nourished: 20% vs 5%), poor appetite (77% vs 24%), dementia (44% vs 28%), depression (34% vs 22%), and poor functional status (MBI 48.3±29.8 vs 65.1±25.4). The SGA had the highest completion rate (100%) and was predictive of the highest number of clinical outcomes: LOS >11 days (OR 2.11, 95% CI [1.17-3.83]), 3-month readmission (OR 1.90, 95% CI [1.05-3.42]) and 6-month mortality (OR 3.04, 95% CI [1.28-7.18]), independent of a comprehensive range of covariates including functional status, disease severity and cognitive function. The SGA is therefore the most appropriate nutritional assessment tool for defining malnutrition. The TTSH NST was identified as the most suitable nutritional screening tool, with the best diagnostic performance against the SGA (AUC 0.865, sensitivity 84%, specificity 79%). Overall, 44% of participants experienced weight loss during hospitalisation, and 27% had weight loss >1% per week over a median LOS of 9 days (range 2-50). Well-nourished (45%) and malnourished (43%) participants were equally prone to experiencing decline in nutritional status (defined by weight loss >1% per week).
Those with reduced nutritional status were more likely to be discharged to higher level care (adjusted OR 2.46, 95% CI [1.27-4.70]). This study is the first to characterise malnourished hospitalised older adults in Singapore. It is also one of the very few studies to (a) evaluate the association of admission malnutrition with clinical outcomes in a multivariate model; (b) determine the change in their nutritional status during admission; and (c) evaluate the validity of nutritional screening and assessment tools amongst hospitalised older adults in an Asian population. Results clearly highlight that admission malnutrition and deterioration in nutritional status are prevalent and are associated with adverse clinical outcomes in hospitalised older adults. With older adults being vulnerable to the risks and consequences of malnutrition, it is important that they are systematically screened so that timely and appropriate intervention can be provided. The findings highlighted in this thesis provide an evidence base for, and confirm the validity of, the current nutrition screening and assessment tools used among hospitalised older adults in Singapore. As older adults may have developed malnutrition prior to hospital admission, or experienced clinically significant weight loss of >1% per week of hospitalisation, screening of the elderly should be initiated in the community, and continuous nutritional monitoring should extend beyond hospitalisation.
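The diagnostic-performance figures quoted for the TTSH NST against the SGA (sensitivity 84%, specificity 79%) follow from a standard 2x2 confusion matrix. The counts below are hypothetical, chosen only to reproduce the reported percentages:

```python
# Standard screening-tool metrics against a reference assessment (here, SGA).
def sensitivity(tp, fn):
    """Proportion of reference-positive cases the screen detects."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Proportion of reference-negative cases the screen clears."""
    return tn / (tn + fp)

# Hypothetical counts consistent with the reported values:
tp, fn = 84, 16  # screened positive / negative among SGA-malnourished
tn, fp = 79, 21  # screened negative / positive among SGA-well-nourished

print(round(sensitivity(tp, fn), 2))  # -> 0.84
print(round(specificity(tn, fp), 2))  # -> 0.79
```

The AUC of 0.865 summarises the same trade-off across all possible screening cut-offs, rather than at the single cut-off shown here.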
Abstract:
Background: Pedometers have become commonplace in physical activity promotion, yet little information exists on who is using them. The multi-strategy, community-based 10,000 Steps Rockhampton physical activity intervention trial provided an opportunity to examine correlates of pedometer use at the population level. Methods: Pedometer use was promoted across all intervention strategies including: local media, pedometer loan schemes through general practice, other health professionals and libraries, direct mail posted to dog owners, walking trail signage, and workplace competitions. Data on pedometer use were collected during the 2-year follow-up telephone interviews from random population samples in Rockhampton, Australia, and a matched comparison community (Mackay). Logistic regression analyses were used to determine the independent influence of interpersonal characteristics and program exposure variables on pedometer use. Results: Data from 2478 participants indicated that 18.1% of Rockhampton and 5.6% of Mackay participants used a pedometer in the previous 18 months. Rockhampton pedometer users (n = 222) were more likely to be female (OR = 1.59, 95% CI: 1.11, 2.23), aged 45 or older (OR = 1.69, 95% CI: 1.16, 2.46) and to have higher levels of education (university degree OR = 4.23, 95% CI: 1.86, 9.6). Respondents with a BMI > 30 were more likely to report using a pedometer (OR = 1.68, 95% CI: 1.11, 2.54) than those in the healthy weight range. Compared with those in full-time paid work, respondents in 'home duties' were significantly less likely to report pedometer use (OR = 0.18, 95% CI: 0.06, 0.53). Exposure to individual program components, in particular seeing 10,000 Steps street signage and walking trails or visiting the website, was also significantly associated with greater pedometer use.
Conclusion: Pedometer use varies between population subgroups, and alternative strategies need to be investigated to engage men, people with lower levels of education, and those in full-time 'home duties' when using pedometers in community-based physical activity promotion initiatives.
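The correlates above are reported as odds ratios from logistic regression. The standard relationship between a fitted coefficient and its odds ratio is OR = exp(beta); a minimal sketch, with the coefficient back-derived from the reported OR for female gender (the coefficient itself is not given in the abstract):

```python
# Odds ratio <-> logistic regression coefficient, for a binary predictor.
from math import exp, log

reported_or_female = 1.59          # OR for female vs male, from the abstract
beta_female = log(reported_or_female)  # the coefficient that would yield this OR

odds_ratio = exp(beta_female)      # exponentiating recovers the odds ratio
print(round(odds_ratio, 2))        # -> 1.59
```

The same transform applies to the confidence limits: exponentiating the CI of the coefficient gives the CI of the odds ratio (e.g. the reported 1.11-2.23 for gender).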