171 results for Missions to leprosy patients.
Abstract:
Background Nurses play a substantial role in the prevention and management of chemotherapy-induced nausea and vomiting (CINV). Objectives This study set out to describe nurses' roles in the prevention and management of CINV and to identify any gaps that exist across countries. Methods A self-reported survey was completed by 458 registered nurses who administered chemotherapy to cancer patients in Australia, China, Hong Kong, and 9 Latin American countries. Results More than one-third of participants regarded their own knowledge of CINV as fair to poor. Most participants agreed that chemotherapy-induced nausea and chemotherapy-induced vomiting should be considered separately (79%), but only about half were confident in their ability to manage chemotherapy-induced nausea (53%) or chemotherapy-induced vomiting (59%). Only one-fifth reported frequent use of a standardized CINV assessment tool, and only a quarter used international clinical guidelines to manage CINV. Conclusions Participants perceived their own knowledge of CINV management to be insufficient. They recognized the need to develop and use a standardized CINV assessment tool and the importance of adopting international guidelines to inform the management of CINV. Implications for Practice Findings indicate that international guidelines should be made available to nurses in clinically relevant and easily accessible formats, that a review of chemotherapy assessment tools should be undertaken to identify reliable and valid measures amenable to use in clinical settings, and that a CINV risk screening tool should be developed as a prompt for nurses to enable timely identification of and intervention for patients at high risk of CINV.
Abstract:
This tutorial primarily focuses on the technical challenges surrounding the design and implementation of Accountable-eHealth (AeH) systems. The potential benefits of shared eHealth record systems are promising for the future of improved healthcare; however, their uptake is hindered by concerns over the privacy and security of patient information. In the current eHealth environment, there are competing requirements between healthcare consumers (i.e., patients) and healthcare professionals (HCPs). While consumers want control over their information, healthcare professionals want access to as much information as required in order to make well-informed decisions. This conflict is evident in the review of Australia's PCEHR system. Accountable-eHealth systems aim to balance these concerns by implementing Information Accountability (IA) mechanisms. AeH systems create an eHealth environment where health information is available to the right person at the right time without rigid barriers, while empowering consumers with information control and transparency, thus enabling the creation of shared eHealth records that are useful to both patients and HCPs. In this half-day tutorial, we will discuss and describe the technical challenges surrounding the implementation of AeH systems and the solutions we have devised. A prototype AeH system will be used to demonstrate the functionality of AeH systems and illustrate some of the proposed solutions. The topics covered will include: designing for usability in AeH systems, the privacy and security of audit mechanisms, providing for diversity of users, the scalability of AeH systems, and finally the challenges of enabling research and Big Data analytics on shared eHealth records while ensuring accountability and privacy are maintained.
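To make the Information Accountability idea above concrete, here is a minimal sketch, in Python, of the kind of transparent, append-only access log an AeH system might maintain: access is not blocked by rigid barriers, but every access is recorded and visible to the patient. The class and field names are illustrative assumptions, not the actual design of the prototype discussed in the tutorial.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List

# Illustrative sketch only: an AeH-style system replaces rigid access barriers
# with after-the-fact accountability, so every access to a shared record is
# logged and the log is transparent to the patient. All names are hypothetical.

@dataclass(frozen=True)
class AccessEvent:
    accessor_id: str      # healthcare professional who viewed the record
    record_id: str        # shared eHealth record that was accessed
    purpose: str          # stated reason for the access
    timestamp: datetime

class AuditLog:
    """Append-only log that consumers can inspect for transparency."""
    def __init__(self) -> None:
        self._events: List[AccessEvent] = []

    def record_access(self, accessor_id: str, record_id: str, purpose: str) -> None:
        self._events.append(AccessEvent(
            accessor_id, record_id, purpose, datetime.now(timezone.utc)))

    def events_for_record(self, record_id: str) -> List[AccessEvent]:
        # A patient can see who accessed their record, when, and why.
        return [e for e in self._events if e.record_id == record_id]

log = AuditLog()
log.record_access("hcp-42", "record-7", "emergency treatment")
print(log.events_for_record("record-7"))
```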
Abstract:
Background: Cancer metastasis is the main contributor to breast cancer fatalities, as women with metastatic disease have poorer survival outcomes than women with localised breast cancers. There is an urgent need to develop appropriate prognostic methods to stratify patients based on the propensities of their cancers to metastasise. The insulin-like growth factor (IGF)-I:IGF binding protein (IGFBP):vitronectin complexes have been shown to stimulate changes in gene expression favouring increased breast cancer cell survival and a migratory phenotype. We therefore investigated the prognostic potential of these IGF- and extracellular matrix (ECM) interaction-induced proteins in the early identification of breast cancers with a propensity to metastasise, using patient-derived tissue microarrays. Methods: Semiquantitative immunohistochemistry analyses were performed to compare the extracellular and subcellular distribution of IGF- and ECM-induced signalling proteins among matched normal, primary cancer and metastatic cancer formalin-fixed paraffin-embedded breast tissue samples. Results: The IGF- and ECM-induced signalling proteins were differentially expressed between subcellular and extracellular localisations. Vitronectin and IGFBP-5 immunoreactivity was lower, while β1 integrin immunoreactivity was higher, in the stroma surrounding metastatic cancer tissues compared to normal breast and primary cancer stromal tissues. In addition, immunoreactive stratifin was increased in the stroma of primary as well as metastatic breast tissues. Immunoreactive fibronectin and β1 integrin were highly expressed at the leading edge of tumours. Based on the immunoreactivity, it was apparent that the cell signalling proteins AKT1 and ERK1/2 shuttled from the nucleus to the cytoplasm with tumour progression. Conclusion: This is the first in-depth, compartmentalised analysis of the distribution of IGF- and ECM-induced signalling proteins in metastatic breast cancers. This study has provided insights into the changing pattern of cellular localisation and expression of IGF- and ECM-induced signalling proteins at different stages of breast cancer. The differential distribution of these biomarkers could provide important prognostic and predictive indicators to assist the clinical management of breast disease, namely in the early identification of cancers with a propensity to metastasise and/or recur following adjuvant therapy.
Abstract:
This paper reports on the 2nd ShARe/CLEFeHealth evaluation lab, which continues our evaluation resource building activities for the medical domain. In this lab we focus on patients' information needs, as opposed to the more common campaign focus on the specialised information needs of physicians and other healthcare workers. The usage scenario of the lab is to ease patients' and next-of-kins' understanding of eHealth information, in particular clinical reports. The 1st ShARe/CLEFeHealth evaluation lab was held in 2013 and consisted of three tasks: Task 1 focused on named entity recognition and normalization of disorders; Task 2 on normalization of acronyms/abbreviations; and Task 3 on information retrieval to address questions patients may have when reading clinical reports. This year's lab introduces a new challenge in Task 1 on visual-interactive search and exploration of eHealth data. Its aim is to help patients (or their next-of-kin) with readability issues related to their hospital discharge documents and with related information search on the Internet. Task 2 then continues the information extraction work of the 2013 lab, specifically focusing on disorder attribute identification and normalization from clinical text. Finally, this year's Task 3 further extends the 2013 information retrieval task by cleaning the 2013 document collection and introducing a new query generation method and multilingual queries. De-identified clinical reports used by the three tasks were from US intensive care and originated from the MIMIC II database. Other text documents for Tasks 1 and 3 were from the Internet and originated from the Khresmoi project. Task 2 annotations originated from the ShARe annotations. For Tasks 1 and 3, new annotations, queries, and relevance assessments were created. 50, 79, and 91 people registered their interest in Tasks 1, 2, and 3, respectively. 24 unique teams participated, with 1, 10, and 14 teams in Tasks 1, 2, and 3, respectively. The teams were from Africa, Asia, Canada, Europe, and North America. The Task 1 submission, reviewed by 5 expert peers, related to the task evaluation category of Effective use of interaction and targeted the needs of both expert and novice users. The best system had an Accuracy of 0.868 in Task 2a, an F1-score of 0.576 in Task 2b, and a Precision at 10 (P@10) of 0.756 in Task 3. The results demonstrate the substantial community interest in, and capabilities of, these systems in making clinical reports easier for patients to understand. The organisers have made data and tools available for future research and development.
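As a pointer to how the Task 3 headline figure is computed, the following is a small sketch of the Precision at 10 (P@10) measure; the document identifiers and relevance judgements are invented for illustration and are not CLEF data.

```python
# Minimal sketch of the retrieval measure reported for Task 3: Precision at 10.
# Document identifiers and relevance judgements below are invented examples.

def precision_at_k(ranked_doc_ids, relevant_ids, k=10):
    """Fraction of the top-k retrieved documents judged relevant."""
    top_k = ranked_doc_ids[:k]
    hits = sum(1 for doc_id in top_k if doc_id in relevant_ids)
    return hits / k

# Hypothetical run: 7 of the top 10 documents are relevant, so P@10 = 0.7
# (for comparison, the best Task 3 system reported P@10 = 0.756).
ranked = [f"doc{i}" for i in range(1, 21)]
relevant = {"doc1", "doc2", "doc3", "doc5", "doc7", "doc8", "doc10"}
print(precision_at_k(ranked, relevant))  # 0.7
```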
Abstract:
Objective The move internationally by governments and other health providers to encourage patients to have their own electronic personal health records (e-PHRs) is growing exponentially. In Australia, the initiative for a personally controlled electronic health record (known as PCEHR) is directed towards the public at large. The first objective of this study, then, is to examine how individuals in the general population perceive the promoted idea of having a PCEHR. The second objective is to extend prior research by applying a theoretically derived consumer technology acceptance model to guide the study. Method An online survey was conducted to capture perceptions and beliefs about having a PCEHR, identified from technology acceptance models and the extant literature. The survey was completed by 750 Queensland respondents, 97% of whom did not have a PCEHR at that time. The model was examined using exploratory factor analysis, regressions and mediation tests. Results Findings support eight of the 11 hypothesised relationships in the model. Perceived value and perceived risk were the two most important variables explaining attitude, with perceived usefulness and compatibility being weak but significant. The perception of risk was reduced through partial mediation from trust and privacy concerns. Additionally, web self-efficacy and ease of use partially mediate the relationship between attitude and intentions. Conclusions The findings represent a snapshot of the early stages of implementing this Australian initiative and capture the perceptions of Queenslanders who at present do not have a PCEHR. Findings show that while individuals appreciate the value of having this record, they do not appear to regard it as particularly useful at present, nor is it particularly compatible with their current engagement with e-services. Moreover, they will need to have any concerns about the risks alleviated, particularly through an increased sense of trust and a reduction of privacy concerns. It is noted that although the respondents are non-adopters, they do not feel that they lack the necessary web skills to set up and use a PCEHR. To the best of our knowledge, this is one of a very limited number of studies that examine a national-level implementation of an e-PHR system, where take-up of the PCEHR is optional rather than a centralised, mandated requirement.
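The partial-mediation findings can be illustrated with a standard regression-based mediation check: the total effect of a predictor on attitude shrinks, but does not vanish, once the mediator enters the model. The sketch below uses simulated data that merely mimic a trust → perceived risk → attitude structure; it is not the study's data or exact procedure.

```python
import numpy as np
import statsmodels.api as sm

# Hedged sketch of a regression-based mediation check of the kind reported
# (e.g. trust reducing perceived risk, which in turn shapes attitude).
# Data are simulated; variable names mirror the constructs only loosely.

rng = np.random.default_rng(42)
n = 750
trust = rng.normal(size=n)
risk = -0.5 * trust + rng.normal(size=n)          # trust lowers perceived risk
attitude = -0.4 * risk + 0.2 * trust + rng.normal(size=n)

# Total effect of trust on attitude (mediator omitted).
c_total = sm.OLS(attitude, sm.add_constant(trust)).fit().params[1]

# Direct effect of trust once perceived risk is controlled for.
X_med = sm.add_constant(np.column_stack([trust, risk]))
c_direct = sm.OLS(attitude, X_med).fit().params[1]

# Partial mediation: the direct effect shrinks but remains non-zero.
print(f"total effect: {c_total:.2f}, direct effect: {c_direct:.2f}")
```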
Abstract:
Inhibitory control deficits are well documented in schizophrenia, supported by impairment in an established measure of response inhibition, the stop-signal reaction time (SSRT). We investigated the neural basis of this impairment by comparing schizophrenia patients and controls matched for age, sex and education on behavioural, functional magnetic resonance imaging (fMRI) and event-related potential (ERP) indices of stop-signal task performance. Compared to controls, patients exhibited slower SSRTs and reduced right inferior frontal gyrus (rIFG) activation, although rIFG activation correlated with SSRT in both groups. Go-stimulus and stop-signal ERP components (N1/P3) were smaller in patients, and the peak latencies of the stop-signal N1 and P3 were also delayed in patients, indicating impairment early in stop-signal processing. Additionally, response-locked lateralised readiness potentials indicated that response preparation was prolonged in patients. An inability to engage the rIFG may underlie slowed inhibition in patients; however, multiple spatiotemporal irregularities in the networks underpinning stop-signal task performance may contribute to this deficit.
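The abstract does not specify how SSRT was estimated; assuming the commonly used "integration method" from the horse-race model of stopping, a minimal sketch looks like the following, with entirely simulated data.

```python
import numpy as np

# Illustrative sketch of the standard "integration method" for estimating
# stop-signal reaction time (SSRT). This is an assumption based on common
# practice, not the paper's stated procedure; all data below are simulated.

def estimate_ssrt(go_rts, stop_signal_delays, responded_on_stop):
    """SSRT = nth fastest go RT minus mean stop-signal delay (SSD),
    where n is set by the probability of responding on stop trials."""
    p_respond = np.mean(responded_on_stop)        # P(respond | stop signal)
    sorted_rts = np.sort(go_rts)
    n = int(np.ceil(p_respond * len(sorted_rts))) - 1
    nth_rt = sorted_rts[max(n, 0)]
    return nth_rt - np.mean(stop_signal_delays)

# Hypothetical data: 100 go trials, 40 stop trials with ~50% failed stops.
rng = np.random.default_rng(0)
go_rts = rng.normal(450, 60, 100)                 # go reaction times, ms
ssds = rng.choice([150, 200, 250, 300], 40)       # stop-signal delays, ms
responded = rng.random(40) < 0.5                  # failed inhibitions
print(f"Estimated SSRT: {estimate_ssrt(go_rts, ssds, responded):.0f} ms")
```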
Abstract:
Background A brief intervention, conducted in the acute care setting after an alcohol-related injury, has been reported to be highly beneficial in reducing the risk of re-injury and in reducing subsequent levels of alcohol consumption. This project aimed to understand Australasian oral and maxillofacial surgeons' attitudes, knowledge and skills in relation to alcohol screening and brief intervention (SBI) within acute settings for patients admitted with facial trauma. Materials and Methods A web-based survey was made available to all members (n=200-250) of the Australian and New Zealand Association of Oral and Maxillofacial Surgeons (ANZAOMS), promoted through a number of email bulletins sent by the Association to all members. Implied consent was assumed for participants who completed the online survey. The survey explored their current level of involvement in treating patients with alcohol-related facial trauma, as well as their knowledge of and attitudes towards alcohol screening and brief intervention. The survey also explored their willingness to undertake further training and to be involved in implementing an SBI program. Part of the survey presented a hypothetical case of a patient with a facial injury and a drinking history, and participants were asked how they would respond to this scenario. Results A total of 58 surgeons completed the online survey; 91% were male and 88% were consultant surgeons. 71% would take an alcohol history, 29% would deliver a brief alcohol intervention, and 14% would refer patients to an alcohol treatment service or clinician. 40% agreed they had adequate training in managing patients with alcohol-related injuries, while only 17% and 19% felt they had adequate time and resources, respectively. 76% of surgeons reported the need for more information on where to refer patients for appropriate alcohol treatment. Conclusion The study findings confirm the challenges and barriers to implementing brief alcohol intervention in current practice. Service gaps exist, as do opportunities for training.
Abstract:
Schizophrenia results in a profound disruption of one's capacity to make sense of mental states, coherently narrate self-experiences, and meaningfully relate to others. While current treatment options for people with schizophrenia tend to be symptom-focused, experience in designing and implementing a study focused on enhancing sense of self demonstrates the feasibility of developing and implementing models of treatment that prioritize the subjective distress and self-experience of people with schizophrenia. Emerging research evidence, based upon the dialogical theory of self, suggests that people with deficits of self can engage in meaningful therapeutic relationships and work toward greater integrity of self and degrees of recovery. The challenge is to translate these ideas into a research methodology that can be successfully applied within therapeutic contexts with people who meet the diagnostic criteria for schizophrenia. Based upon dialogical theory, we developed a principle-based manual for metacognitive narrative psychotherapy: a psychological approach to the treatment of people with schizophrenia that aims to enhance metacognitive capacity and the ability to narrate self-experiences. Five phases of treatment were identified: (1) developing a therapeutic relationship, (2) eliciting narratives, (3) enhancing metacognitive capacity, (4) enriching narratives, and (5) living enriched stories. Proscribed practices were also identified. We then implemented the manual within a university clinic context. Six therapists were trained to implement the model and, in turn, provided therapy to 11 patients who completed 12 to 24 months of treatment. Participants were assessed on metacognitive capacity, narrative coherence, narrative richness, self-reported recovery, and symptomatology at three points in time over the course of therapy. Contrary to expectations, participants were highly engaged in the therapeutic process, with minimal dropout. Overall, more than 75% of participants evidenced improvement in their level of recovery over the course of therapy. The manualization and outcome findings demonstrate the feasibility of applying such interventions to a broader clinical population.
Abstract:
Information and communication technologies are dramatically transforming Allopathic medicine. Technological developments including telemedicine, electronic health records, standards to ensure computer systems interoperate, data mining, simulation, decision support and easy access to medical information each contribute to empowering patients in new ways and to changing the practice of medicine. To date, informatics has had little impact on Ayurvedic medicine. This tutorial provides an introduction to key informatics initiatives in Allopathic medicine using real examples and suggests how such applications can be adapted to Ayurvedic medicine.
Abstract:
Deprivation has previously been shown to be an independent risk factor for the high prevalence of malnutrition observed in COPD (Collins et al., 2010). It has been suggested that the socioeconomic gradient observed in COPD is greater than in any other chronic disease (Prescott & Vestbo, 1999). The current study aimed to examine the influence of disease severity and social deprivation on malnutrition risk in outpatients with COPD. 424 COPD outpatients were screened using the 'Malnutrition Universal Screening Tool' ('MUST'). COPD disease severity was recorded in accordance with the GOLD criteria, and deprivation was established according to the patient's geographical location (postcode) at the time of nutritional screening using the UK Government's Index of Multiple Deprivation (IMD). The IMD ranks postcodes from 1 (most deprived) to 32,482 (least deprived). Disease severity was positively associated with an increased prevalence of malnutrition risk (p < 0.001) both within and between groups, whilst rank IMD was negatively associated with malnutrition (p = 0.020), i.e. those residing in less deprived areas were less likely to be malnourished. Within each category of disease severity, the prevalence of malnutrition was two-fold greater in those residing in the most deprived areas compared to those residing in the least deprived areas. This study suggests that deprivation and disease severity are independent risk factors for malnutrition in COPD, both contributing to the widely variable prevalence of malnutrition. Consideration of these issues could assist with the targeted nutritional management of these patients.
Abstract:
We sought to determine the impact of optometric practice setting on contact lens prescribing by analysing annual survey data on lens fits collected between 2009 and 2013 from independent and national group practices throughout the United Kingdom. Compared to national group practices, independent practices fit contact lenses to older patients and to a higher proportion of females. Independent practices also undertake a lower proportion of soft lens fits overall (and thus a higher proportion of rigid lens fits), as well as lower proportions of soft toric and daily disposable lens fits. There is a higher proportion of soft extended wear and multifocal lens fits in independent practices. We conclude that contact lens fitting behaviour is influenced by optometric practice setting.
Abstract:
Background: It is important to identify patients who are at risk of malnutrition upon hospital admission, as malnutrition results in poor outcomes such as longer length of hospital stay and increased readmissions, hospitalisation costs and mortality. The aim of this study was to determine the prognostic validity of 3-Minute Nutrition Screening (3-MinNS) in predicting hospital outcomes in patients admitted to an acute tertiary hospital, with patients matched by diagnosis-related group (DRG). Methods: In this study, 818 adult patients were screened for risk of malnutrition using 3-MinNS within 24 hours of admission. Mortality data were collected from the National Registry, and other hospitalisation outcomes were retrieved from electronic hospital records. The results were adjusted for age, gender and ethnicity, and matched for DRG. Results: Patients identified by 3-MinNS as being at risk of malnutrition (37%) had significantly longer lengths of hospital stay (6.6 ± 7.1 days vs. 4.5 ± 5.5 days, p<0.001), higher hospitalisation costs (S$4540 ± 7190 vs. S$3630 ± 4961, p<0.001) and increased mortality rates at 1 year (27.8% vs. 3.9%), 2 years (33.8% vs. 7.2%) and 3 years (39.1% vs. 10.5%); p<0.001 for all. Conclusions: The 3-MinNS is able to predict clinical outcomes and can be used to screen newly admitted patients for nutrition risk so that appropriate nutrition assessment and early nutritional intervention can be initiated.
Abstract:
Aim This paper reports a study of workplace aggression among nurses in Tasmania, Australia. Background There is international concern about a perceived rise in occupational violence as a major worldwide public health problem, with associated financial costs. There is reason to suspect that aggression towards nurses is increasing. For example, increased illicit drug use puts nurses at the sharp end in managing patients admitted with drug-related problems. Such people are often resistant to healthcare intervention, and often have associated disorders, including mental illness. Despite this increased awareness, comprehensive data on occupational violence in nursing are not available. Method A specially designed questionnaire was sent to all nurses registered with the Nursing Board of Tasmania (n = 6326) in November/December 2002, with 2407 usable questionnaires returned. The response rate was 38%. Findings A majority of respondents (63.5%) had experienced some form of aggression (verbal or physical abuse) in the four working weeks immediately prior to the survey. Patients/clients or their visitors were identified as the main perpetrators, followed by medical and nursing colleagues. Abuse influenced nurses' distress, their desire to stay in nursing, their productivity and the potential to make errors, yet they were reluctant to make their complaints 'official'. As well as reporting high levels of verbal and physical abuse, nurses were distressed because they could not provide the appropriate care to meet patients' needs. Few working environments were free of aggression. Conclusion Future research should try to determine the specific factors, including staff characteristics and environment, associated with the high levels of aggression reported in 'hot spots' where, on the basis of the present results, many staff experience high levels of verbal and physical abuse. Unless managers take steps to improve the situation, attrition from the profession for this reason will continue.
Abstract:
Migraine and major depressive disorder (MDD) are comorbid, moderately heritable and to some extent influenced by the same genes. In a previous paper, we suggested the possibility of causality (one trait causing the other) underlying this comorbidity. We present a new application of polygenic (genetic risk) score analysis to investigate the mechanisms underlying the genetic overlap of migraine and MDD. Genetic risk scores were constructed based on data from two discovery samples in which genome-wide association analyses (GWA) were performed for migraine and MDD, respectively. The Australian Twin Migraine GWA study (N = 6,350) included 2,825 migraine cases and 3,525 controls, 805 of whom met the diagnostic criteria for MDD. The RADIANT GWA study (N = 3,230) included 1,636 MDD cases and 1,594 controls. Genetic risk scores for migraine and for MDD were used to predict pure and comorbid forms of migraine and MDD in an independent Dutch target sample (NTR-NESDA, N = 2,966), which included 1,476 MDD cases and 1,058 migraine cases (723 of these individuals had both disorders concurrently). The observed patterns of prediction suggest that the 'pure' forms of migraine and MDD are genetically distinct disorders. The subgroup of individuals with comorbid MDD and migraine were genetically most similar to MDD patients. These results indicate that in at least a subset of migraine patients with MDD, migraine may be a symptom or consequence of MDD.
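For readers unfamiliar with polygenic (genetic risk) score analysis, the sketch below shows the core computation: each individual's score is the sum of their risk-allele dosages weighted by the per-variant effect sizes estimated in the discovery GWAS, and the resulting scores are then used to predict case status in the target sample. All numbers are simulated; this is a generic illustration, not the authors' pipeline.

```python
import numpy as np

# Toy sketch of how a polygenic risk score is built. Variant counts, effect
# sizes and sample sizes below are invented purely for illustration.

def polygenic_score(dosages, effect_sizes):
    """dosages: (n_individuals, n_snps) risk-allele counts in {0, 1, 2};
    effect_sizes: per-SNP effect estimates (e.g. log odds ratios) from
    the discovery GWAS. Returns one score per individual."""
    return dosages @ effect_sizes

rng = np.random.default_rng(1)
n_people, n_snps = 5, 1000
dosages = rng.integers(0, 3, size=(n_people, n_snps)).astype(float)
betas = rng.normal(0, 0.02, n_snps)   # many small per-SNP effects

scores = polygenic_score(dosages, betas)
print(scores)  # one risk score per individual, used to predict case status
```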
Abstract:
Design Proposal for the Blue Lunar Support Hub. The conceptual design of a space station is one of the most challenging tasks in aerospace engineering. The history of the space station Mir and the assembly of the International Space Station demonstrate that, even during the assembly phase, quick solutions have to be found to cope with budget and technical problems or changing objectives. This report is the outcome of the conceptual design work of the Space Station Design Workshop (SSDW) 2007, which took place as an international design project from the 16th to the 21st of July 2007 at the Australian Centre for Field Robotics (ACFR), University of Sydney, Australia. The participants were tasked to design a human-tended space station in low lunar orbit (LLO) focused on supporting future missions to the Moon in a programmatic context of space exploration beyond low Earth orbit (LEO). The design incorporated elements ranging from systems engineering to interior architecture. Customised, intuitive, rapid-turnaround software tools enabled the team to successfully tackle the complex problem of the conceptual design of crewed space systems. A strong emphasis was put on improving the integration of the human crew, as it is the major contributor to mission success, while always respecting the boundary conditions imposed by the challenging environment of space. This report documents the methodology, tools and outcomes of the SSDW 2007. The design results produced by Team Blue are presented.