907 results for Raymond Vigue
Abstract:
Background Radiation-induced skin reaction (RISR) is a common side effect that affects the majority of cancer patients receiving radiation treatment. RISR is often characterised by swelling, redness, pigmentation, fibrosis, ulceration, pain, warmth, burning, and itching of the skin. The aim of this systematic review was to assess the effects of interventions which aim to prevent or manage RISR in people with cancer. Methods We searched the following databases up to November 2012: Cochrane Skin Group Specialised Register, CENTRAL (2012, Issue 11), MEDLINE (from 1946), EMBASE (from 1974), PsycINFO (from 1806), CINAHL (from 1981) and LILACS (from 1982). Randomized controlled trials evaluating interventions for preventing or managing RISR in cancer patients were included. The primary outcomes were development of RISR, and levels of RISR and symptom severity. Secondary outcomes were time taken to develop erythema or dry desquamation; quality of life; time taken to heal; a number of skin reaction and symptom severity measures; cost; participant satisfaction; ease of use; and adverse effects. Where appropriate, we pooled results of randomized controlled trials using mean differences (MD) or odds ratios (OR) with 95% confidence intervals (CI). Results Forty-seven studies were included in this review. These evaluated six types of interventions (oral systemic medications; skin care practices; steroidal topical therapies; non-steroidal topical therapies; dressings; and others). Findings from two meta-analyses demonstrated significant benefits of oral Wobe-Mugos E for preventing RISR (OR 0.13 (95% CI 0.05 to 0.38)) and limiting the maximal level of RISR (MD −0.92 (95% CI −1.36 to −0.48)). Another meta-analysis reported that wearing deodorant does not influence the development of RISR (OR 0.80 (95% CI 0.47 to 1.37)).
Conclusions Despite the high number of trials in this area, there is little good-quality comparative research providing definitive evidence of the effectiveness of any single intervention for reducing RISR. More research is required to demonstrate the usefulness of the wide range of products being used to reduce RISR. Future efforts to reduce RISR severity should focus on promising interventions, such as Wobe-Mugos E and oral zinc.
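As a quick sanity check on the meta-analysis figures reported above, an approximate z-statistic can be back-calculated from a reported odds ratio and its 95% confidence interval by working on the log scale, where the CI half-width is 1.96 standard errors. This is our own illustration of the standard calculation, not part of the review itself.

```python
import math

# Back-calculate an approximate z-statistic from an odds ratio and its
# 95% CI: on the log scale the CI spans 2 * 1.96 standard errors.
def z_from_or_ci(odds_ratio, ci_low, ci_high):
    se_log_or = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)
    return math.log(odds_ratio) / se_log_or

# Wobe-Mugos E for preventing RISR: OR 0.13 (0.05 to 0.38)
z_wobe = z_from_or_ci(0.13, 0.05, 0.38)  # well beyond -1.96: significant benefit
# Deodorant use: OR 0.80 (0.47 to 1.37)
z_deo = z_from_or_ci(0.80, 0.47, 1.37)   # within ±1.96: no detectable effect
```

Note how the deodorant CI straddling 1 corresponds to a z-statistic inside the ±1.96 band, matching the review's conclusion of no influence on RISR development.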
Abstract:
This thesis demonstrated that race mattered as a contributing factor to the low Indigenous participation rates within the Australian Public Service. The thesis showed that the public service reproduced social relations privileging non-Indigenous executives while positioning Indigenous executives as deficient. The thesis explains how the everydayness of racism assumes the racial neutrality of institutions because the concept of race is externalised as only having relevance to the racial other. Non-Indigenous executives regard Indigeneity as being synonymous with inferiority to explain Indigenous disadvantage. Consequently, the Indigenous experience of everyday racism is perpetuated and contributes to declining rates of employment.
Abstract:
In this research paper, we study a simple programming problem that only requires knowledge of variables and assignment statements, and yet we found that some early novice programmers had difficulty solving the problem. We also present data from think aloud studies which demonstrate the nature of those difficulties. We interpret our data within a neo-Piagetian framework which describes cognitive developmental stages through which students pass as they learn to program. We describe in detail think aloud sessions with novices who reason at the neo-Piagetian preoperational level. Those students exhibit two problems. First, they focus on very small parts of the code and lose sight of the "big picture". Second, they are prone to focus on superficial aspects of the task that are not functionally central to the solution. It is not until the transition into the concrete operational stage that decentration of focus occurs, and they have the cognitive ability to reason about abstract quantities that are conserved, and are equipped to adapt skills to closely related tasks. Our results, and the neo-Piagetian framework on which they are based, suggest that changes are necessary in teaching practice to better support novices who have not reached the concrete operational stage.
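For concreteness, a problem of the kind described, requiring only variables and assignment statements, might look like the classic "swap two variables" exercise. This is our own illustration; the paper's exact problem is not given here.

```python
# Task: swap the values held in a and b, using only variables and
# assignment statements (no tuples, no arithmetic tricks).
a = 3
b = 7
temp = a    # hold the original value of a
a = b       # overwrite a with the value of b
b = temp    # place a's original value into b
# a now holds 7 and b now holds 3
```

A preoperational novice, in the sense used above, might fixate on a single assignment line (a superficial feature) and lose track of which value each variable holds at each step, which is exactly the "big picture" loss the think aloud sessions document.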
Abstract:
Recent research from within a neo-Piagetian perspective proposes that novice programmers pass through the sensorimotor and preoperational stages before being able to reason at the concrete operational stage. However, academics traditionally teach and assess introductory programming as if students commence at the concrete operational stage. In this paper, we present results from a series of think aloud sessions with a single student, known by the pseudonym “Donald”. We conducted the sessions mainly over one semester, with an additional session three semesters later. Donald first manifested predominantly sensorimotor reasoning, followed by preoperational reasoning, and finally concrete operational reasoning. This longitudinal think aloud study of Donald is the first direct observational evidence of a novice programmer progressing through the neo-Piagetian stages.
Abstract:
Stigmergy is a biological term used when discussing a subset of insect swarm behaviour, describing the apparent organisation seen during their activities. Stigmergy describes a communication mechanism based on environment-mediated signals which trigger responses among the insects. The phenomenon is demonstrated in the behaviour of ants and their food-gathering process when following pheromone trails, where the pheromones are a form of environment-mediated communication. What is striking about this phenomenon is that highly organised societies are achieved without an apparent management structure. Stigmergy is also observed in human environments, both natural and engineered. It is implicit in the Web, where sites provide a virtual environment supporting coordinative contributions. Researchers in varying disciplines appreciate the power of this phenomenon and have studied how to exploit it. As stigmergy becomes more widely researched, we see its definition mutate as papers citing the original work become referenced themselves. Each paper interprets these works in ways very specific to the research being conducted. Our own research aims to better understand what improves the collaborative function of a Web site when exploiting the phenomenon. However, when researching stigmergy to develop our understanding, we discovered the lack of a standardised, abstract model of the phenomenon. Papers frequently cite the same generic descriptions before becoming intimately focused on formal specifications of an algorithm, or on esoteric discussions regarding sub-facets of the topic. None provides a holistic, macro-level view to model and standardise the nomenclature. This paper provides a content analysis of influential literature, documenting the numerous theoretical and experimental papers that have focused on stigmergy. We establish that stigmergy is a phenomenon that transcends the insect world and is more than just a metaphor when applied to the human world.
We present from our own research our general theory and abstract model of semantics of stigma in stigmergy. We hope our model will clarify the nuances of the phenomenon into a useful road-map, and standardise vocabulary that we witness becoming confused and divergent. Furthermore, this paper documents the analysis on which we base our next paper: Special Theory of Stigmergy: A Design Pattern for Web 2.0 Collaboration.
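The ant example above can be sketched as a minimal simulation loop. This is purely our own illustrative construction (names and parameters included), not a model from the paper: agents that find a "food" cell deposit a pheromone mark there, the environment evaporates marks over time, and later agents are biased toward marked cells, so coordination emerges with no central controller.

```python
import random

random.seed(1)
cells, food = 20, 15              # a 1-D environment with one food cell
evaporation, deposit = 0.95, 1.0  # per-step decay factor and mark size
pheromone = [0.0] * cells

def choose_cell(pher):
    # Environment-mediated choice: probability proportional to 1 + mark.
    weights = [1.0 + p for p in pher]
    return random.choices(range(len(pher)), weights=weights)[0]

for step in range(500):
    cell = choose_cell(pheromone)
    if cell == food:              # a successful visit reinforces the mark
        pheromone[cell] += deposit
    pheromone = [p * evaporation for p in pheromone]  # decay in the environment

# Only the food cell is ever reinforced, so the strongest mark ends up
# there and later agents converge on it without any management structure.
strongest = max(range(cells), key=lambda i: pheromone[i])
```

The point of the sketch is the division of labour: all "communication" happens through the shared `pheromone` state (the stigma), never agent-to-agent, which is the defining property the content analysis traces across the literature.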
Abstract:
This book comprises 11 chapters, alternating between two authors (a patient with metastatic pancreatic cancer and an oncologist)...
Abstract:
Recent studies have linked the ability of novice (CS1) programmers to read and explain code with their ability to write code. This study extends earlier work by asking CS2 students to explain object-oriented data structures problems that involve recursion. Results show a strong correlation between ability to explain code at an abstract level and performance on code writing and code reading test problems for these object-oriented data structures problems. The authors postulate that there is a common set of skills concerned with reasoning about programs that explains the correlation between writing code and explaining code. The authors suggest that an overly exclusive emphasis on code writing may be detrimental to learning to program. Non-code writing learning activities (e.g., reading and explaining code) are likely to improve student ability to reason about code and, by extension, improve student ability to write code. A judicious mix of code-writing and code-reading activities is recommended.
Abstract:
This editorial outlines the current challenges in palliative care provision for patients with a haematological malignancy and the contribution of cancer nurses. There have been significant advancements in the care of patients with a haematological malignancy over the past three or more decades [1]. Despite this, curative treatment still carries a significant mortality risk, and many patients with a haematological malignancy will die from their disease [1]. A growing body of research indicates patients with a haematological malignancy do not receive best-practice palliative and end-of-life care [2]. Shortfalls in care include poor referral patterns to specialist palliative care services, lack of honest discussions regarding death and dying, inadequate spiritual care for patients and families, patients frequently dying in the acute care setting, and high levels of patient and family distress [2]. A number of efforts in the United Kingdom, United States of America, Sweden and Australia have demonstrated that palliative and haematology care can co-exist, exemplified through clinical case studies and innovative models of care [2]. However, deficits in the provision of palliative care for patients with a haematological malignancy persist, as is evident in the international literature [2]. Addressing this issue requires research exploring new aspects of a complex scenario; here we suggest priority areas of research...
Abstract:
Cancer-related fatigue is one of the most distressing symptoms experienced by patients with advanced cancer. This doctoral study identified that patients with advanced cancer commonly use a number of self-management strategies in response to fatigue, although these strategies had varying levels of effectiveness in reducing the symptom. The study identified that enhancing self-efficacy and managing depressive symptoms are important factors to consider in the design of future interventions to support fatigue self-management.
Abstract:
This is the protocol for a review and there is no abstract. The objectives are as follows: To assess the effects of education programmes for skin cancer prevention in the general population. Description of the condition Skin cancer is a term that includes both melanoma and keratinocyte cancer. Keratinocyte cancer (also known as nonmelanoma skin cancer) generally refers to basal cell carcinoma (BCC) and squamous cell carcinoma (SCC), although it also includes other rare cutaneous neoplasms (Madan 2010). Skin cancer is the most common cancer in populations of predominantly fair-skinned people (Donaldson 2011; Lomas 2012; Stern 2010), with incidence increasing (Garbe 2009; Leiter 2012). There are variations in annual incidence rates between these populations, with Australia reporting the highest rate of skin cancer in the world (Lomas 2012). In 2012, the estimated age-standardised incidence rate for melanoma was almost 63 per 100,000 people for Australian men, and 40 per 100,000 people for Australian women (AIHW 2012). In Europe, incidence rates range from 10 to 15 per 100,000 people (Garbe 2009; Lasithiotakis 2006), with rates highest amongst men (Stang 2006). In the United States, incidence rates are approximately 18 per 100,000 people (Garbe 2009), with the highest rates reported for women (Bradford 2010). Keratinocyte cancer is much more common than melanoma. In 2012, the estimated Australian age-standardised rates for BCC and SCC were 884 and 387 per 100,000 people, respectively (Staples 2006). The cumulative three-year risk of developing a subsequent keratinocyte cancer is 18% for SCC and 44% for BCC (Marcil 2000).
Abstract:
This thesis developed new search engine models that elicit the meaning behind the words found in documents and queries, rather than simply matching keywords. These new models were applied to searching medical records: an area where search is particularly challenging yet can have significant benefits to our society.
Abstract:
This Perspective reflects on the withdrawal of the Liverpool Care Pathway in the UK, and its implications for Australia. Integrated care pathways are documents which outline the essential steps of multidisciplinary care in addressing a specific clinical problem. They can be used to introduce best clinical practice, to ensure that the most appropriate management occurs at the most appropriate time and that it is provided by the most appropriate health professional. By providing clear instructions, decision support and a framework for clinician-patient interactions, care pathways guide the systematic provision of best evidence-based care. The Liverpool Care Pathway (LCP) is an example of an integrated care pathway, designed in the 1990s to guide care for people with cancer who are in their last days of life and are expected to die in hospital. This pathway evolved out of a recognised local need to better support non-specialist palliative care providers caring for patients dying of cancer within their inpatient units. Historically, despite the large number of people in acute care settings whose treatment intent is palliative, dying patients receiving general hospital acute care tended to lack sufficient attention from senior medical staff and nursing staff. The quality of end-of-life care was considered inadequate; it was recognised that much could be learned from the way patients were cared for by palliative care services. The LCP was a strategy developed to improve end-of-life care in cancer patients and was based on the care received by those dying in the palliative care setting.
Abstract:
Background Few cancers pose greater challenges than head and neck (H&N) cancer. Residual effects following treatment include body image changes, pain, fatigue and difficulties with appetite, swallowing and speech. Depression is a common comorbidity. There is limited evidence about ways to assist patients to achieve optimal adjustment after completion of treatment. In this study, we aim to examine the effectiveness and feasibility of a model of survivorship care to improve the quality of life of patients who have completed treatment for H&N cancer. Methods This is a preliminary study in which 120 patients will be recruited. A prospective randomised controlled trial of the H&N Cancer Survivor Self-management Care Plan (HNCP) involving pre- and post-intervention assessments will be used. Consecutive patients who have completed a defined treatment protocol for H&N cancer will be recruited from two large cancer services and randomly allocated to one of three study arms: (1) usual care, (2) information in the form of a written resource or (3) the HNCP delivered by an oncology nurse who has participated in manual-based training and skill development in patient self-management support. The trained nurses will meet patients in a face-to-face interview lasting up to 60 minutes to develop an individualised HNCP, based on principles of chronic disease self-management. Participants will be assessed at baseline, 3 and 6 months. The primary outcome measure is quality of life. The secondary outcome measures include mood, self-efficacy and health-care utilisation. The feasibility of implementing this intervention in routine clinical care will be assessed through semistructured interviews with participating nurses, managers and administrators. Interviews with patients who received the HNCP will explore their perceptions of the HNCP, including factors that assisted them in achieving behavioural change. 
Discussion In this study, we aim to improve the quality of life of a patient population with unique needs by means of a tailored self-management care plan developed upon completion of treatment. Delivery of the intervention by trained oncology nurses is likely to be acceptable to patients and, if successful, will be a model of care that can be implemented for diverse patient populations.
Abstract:
Purpose To investigate the effects of a natural oil-based emulsion containing allantoin versus aqueous cream for preventing and managing radiation-induced skin reactions (RISR). Methods and Materials A total of 174 patients were randomised and participated in the study. Patients received either Cream 1 (the natural oil-based emulsion containing allantoin) or Cream 2 (aqueous cream). Skin toxicity, pain, itching and skin-related quality of life scores were collected for up to four weeks after radiation treatment. Results Patients who received Cream 1 had a significantly lower average level of Common Toxicity Criteria at week 3 (p<0.05), but significantly higher average levels of skin toxicity at weeks 7, 8 and 9 (all p<0.001). Similar results were observed when skin toxicity was analysed by grade. With regard to pain, patients in the Cream 2 group had a significantly higher average level of worst pain (p<0.05) and itching (p=0.046) compared to the Cream 1 group at week 3; however, these differences were not observed at other weeks. In addition, there was a strong trend for Cream 2 to reduce the incidence of grade 2 or higher skin toxicity in comparison to Cream 1 (p=0.056). Overall, more participants in the Cream 1 group needed to use another topical treatment at weeks 8 (p=0.049) and 9 (p=0.01). Conclusion The natural oil-based emulsion containing allantoin appears to have effects similar to aqueous cream for managing skin toxicity up to week 5; however, it becomes significantly less effective later in radiation treatment and beyond treatment completion (week 6 and beyond). There were no major differences in pain, itching or skin-related quality of life. In light of these results, clinicians and patients can base their decision on costs and preferences. Overall, aqueous cream appears to be the preferable option.
Abstract:
The cotton strip assay (CSA) is an established technique for measuring soil microbial activity. The technique involves burying cotton strips and measuring their tensile strength after a certain time. This gives a measure of the rotting rate, R, of the cotton strips; R is then a measure of soil microbial activity. This paper examines properties of the technique and indicates how the assay can be optimised. Humidity conditioning of the cotton strips before measuring their tensile strength reduced the within- and between-day variance and enabled the distribution of the tensile strength measurements to approximate normality. The test data came from a three-way factorial experiment (two soils, two temperatures, three moisture levels). The cotton strips were buried in the soil for intervals of time ranging up to 6 weeks. This enabled the rate of loss of cotton tensile strength with time to be studied under a range of conditions. An inverse cubic model accounted for greater than 90% of the total variation within each treatment combination. This offers support for summarising the decomposition process by the single parameter R. The approximate variance of the decomposition rate was estimated from a function incorporating the variance of tensile strength and the derivative of the decomposition rate, R, with respect to tensile strength. This variance function has a minimum when the measured strength is approximately 2/3 that of the original strength. The estimates of R are almost unbiased and relatively robust against the cotton strips being left in the soil for more or less than the optimal time. We conclude that the rotting rate R should be measured using the inverse cubic equation, and that the cotton strips should be left in the soil until their strength has been reduced to about 2/3 of its original value.
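The 2/3 result can be checked numerically with a delta-method sketch. The abstract does not state the inverse cubic model explicitly, so we assume a form consistent with the description, s0/s = 1 + (R*t)^3, i.e. R = (s0/s − 1)^(1/3)/t; under that assumption the variance of R, approximated as (dR/ds)^2 × Var(s), is smallest when the residual strength s is about 2/3 of the original s0.

```python
import numpy as np

# Assumed inverse cubic model (our reading of the abstract, not stated
# there explicitly): s0/s = 1 + (R*t)**3, so R = (s0/s - 1)**(1/3) / t.
def rot_rate(s, s0=1.0, t=1.0):
    return (s0 / s - 1.0) ** (1.0 / 3.0) / t

# Delta method: Var(R) ~ (dR/ds)**2 * Var(s). With Var(s) roughly constant
# after humidity conditioning, Var(R) is minimised where |dR/ds| is smallest.
s = np.linspace(0.05, 0.95, 9001)      # residual strength as a fraction of s0
dRds = np.gradient(rot_rate(s), s)     # numerical derivative dR/ds
s_min_var = s[np.argmin(dRds ** 2)]    # fraction of s0 giving minimum Var(R)
```

Under this assumed model, `s_min_var` comes out at approximately 0.67, matching the recommendation to dig up the strips when their strength has fallen to about 2/3 of the original.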