851 results for Biomedical databases
Abstract:
OBJECTIVE The aim of this research project was to obtain an understanding of the barriers to and facilitators of providing palliative care in neonatal nursing. This article reports the first phase of this research: to develop and administer an instrument to measure the attitudes of neonatal nurses to palliative care. METHODS The instrument developed for this research (the Neonatal Palliative Care Attitude Scale) underwent face and content validity testing with an expert panel and was pilot tested to establish temporal stability. It was then administered to a population sample of 1285 neonatal nurses in Australian NICUs, with a response rate of 50% (N = 645). Exploratory factor-analysis techniques were conducted to identify scales and subscales of the instrument. RESULTS Data-reduction techniques using principal components analysis were used. Using the criterion of eigenvalues greater than 1, the items in the Neonatal Palliative Care Attitude Scale extracted 6 factors, which accounted for 48.1% of the variance among the items. By further examining the questions within each factor and the Cronbach's α of the items loading on each factor, factors were accepted or rejected. This resulted in acceptance of 3 factors indicating the barriers to and facilitators of palliative care practice. The constructs represented by these factors indicated barriers to and facilitators of palliative care practice relating to (1) the organization in which the nurse practices, (2) the available resources to support a palliative model of care, and (3) the technological imperatives and parental demands. CONCLUSIONS The subscales identified by this analysis comprised items that measured both barriers to and facilitators of palliative care practice in neonatal nursing. While exploratory factor-analysis techniques established the preliminary reliability of the instrument, further testing of the instrument with different samples of neonatal nurses is necessary using a confirmatory factor-analysis approach.
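The factor-retention and reliability steps described above follow a standard pattern: retain principal components whose eigenvalues exceed 1 (the Kaiser criterion) and check the internal consistency of the items loading on each retained factor with Cronbach's α. The sketch below illustrates that pattern on a generic item-response matrix; it is not the authors' analysis code, and the variable names and example data are hypothetical.

```python
import numpy as np

def kaiser_retained_factors(items: np.ndarray) -> int:
    """Count principal components with eigenvalue > 1 (Kaiser criterion).

    items: respondents x items matrix of scale responses.
    """
    corr = np.corrcoef(items, rowvar=False)   # item correlation matrix
    eigenvalues = np.linalg.eigvalsh(corr)    # eigenvalues of the correlation matrix
    return int(np.sum(eigenvalues > 1.0))

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for the items loading on one factor."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 645 respondents answering 26 Likert-type items.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(645, 26)).astype(float)
print(kaiser_retained_factors(responses))
print(cronbach_alpha(responses[:, :5]))  # alpha for items assumed to load on one factor
```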
Abstract:
Background: Noise is a significant barrier to sleep for acute care hospital patients, and sleep has been shown to be therapeutic for health, healing and recovery. Scheduled quiet time interventions to promote inpatient rest and sleep have been successfully trialled in critical care but not in acute care settings. Objectives: The study aim was to evaluate a scheduled quiet time intervention in an acute care setting. The study measured the effect of a scheduled quiet time on noise levels, inpatients' rest and sleep behaviour, and wellbeing. The study also examined the impact of the intervention on patients', visitors' and health professionals' satisfaction, and organisational functioning. Design: The study was a multi-centred non-randomised parallel group trial. Settings: The research was conducted in the acute orthopaedic wards of two major urban public hospitals in Brisbane, Australia. Participants: All patients admitted to the two wards during the 5-month study period were invited to participate, with a final sample of 299 participants recruited. This sample produced an effect size of 0.89 for an increase in the number of patients asleep during the quiet time. Methods: Demographic data were collected to enable comparison between groups. Data on noise level, sleep status, sleepiness and wellbeing were collected using previously validated instruments: a Castle Model 824 digital sound level indicator; a three-point sleep status scale; the Epworth Sleepiness Scale; and the SF12 V2 questionnaire. The staff, patient and visitor surveys on the experimental ward were adapted from published instruments. Results: Significant differences were found between the two groups in mean decibel level and numbers of patients awake and asleep. The difference in mean measured noise levels between the two environments corresponded to a perceived loudness difference of 2 to 1. There were significant correlations between average decibel level and number of patients awake and asleep in the experimental group, and between average decibel level and number of patients awake in the control group. Overall, patients, visitors and health professionals were satisfied with the quiet time intervention. Conclusions: The findings show that a quiet time intervention on an acute care hospital ward can affect noise level and patient sleep/wake patterns during the intervention period. The overall strongly positive response from surveys suggests that scheduled quiet time would be a positively perceived intervention with therapeutic benefit.
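The "perceived difference of 2 to 1" reflects the common psychoacoustic rule of thumb that perceived loudness roughly doubles for every 10 dB increase in sound level. As a hedged illustration (the actual ward measurements are not reproduced here), a difference in mean levels of about 10 dB between the two environments would correspond to a perceived ratio of roughly 2:1:

```python
def perceived_loudness_ratio(delta_db: float) -> float:
    """Approximate loudness ratio implied by a sound-level difference, using the
    rule of thumb that +10 dB is heard as roughly twice as loud."""
    return 2 ** (delta_db / 10)

# Hypothetical example: one ward averaging 10 dB louder than the other.
print(perceived_loudness_ratio(10))  # ~2.0, i.e. perceived as about twice as loud
```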
Abstract:
Objective: To assess extent of coder agreement for external causes of injury using ICD-10-AM for injury-related hospitalisations in Australian public hospitals. Methods: A random sample of 4850 discharges from 2002 to 2004 was obtained from a stratified random sample of 50 hospitals across four states in Australia. On-site medical record reviews were conducted and external cause codes were assigned blinded to the original coded data. Code agreement levels were grouped into the following agreement categories: block level, 3-character level, 4-character level, 5th-character level, and complete code level. Results: At a broad block level, code agreement was found in over 90% of cases for most mechanisms (eg, transport, fall). Percentage disagreement was 26.0% at the 3-character level; agreement for the complete external cause code was 67.6%. For activity codes, the percentage of disagreement at the 3-character level was 7.3% and agreement for the complete activity code was 68.0%. For place of occurrence codes, the percentage of disagreement at the 4-character level was 22.0%; agreement for the complete place code was 75.4%. Conclusions: With 68% agreement for complete codes and 74% agreement for 3-character codes, as well as variability in agreement levels across different code blocks, place and activity codes, researchers need to be aware of the reliability of their specific data of interest when they wish to undertake trend analyses or case selection for specific causes of interest.
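Agreement at each of the categories described above amounts to comparing the originally coded and the audit-assigned codes at increasing levels of detail. The sketch below illustrates this with simple prefix comparison of the code characters; block-level agreement would additionally require a lookup table mapping codes to ICD-10-AM blocks (e.g. falls, transport), which is omitted here. The example codes are illustrative only and not drawn from the study data.

```python
def agreement_levels(original: str, audit: str) -> dict:
    """Agreement between two external cause codes at 3-character,
    4-character and complete-code level."""
    o, a = original.replace(".", ""), audit.replace(".", "")
    return {
        "3-character": o[:3] == a[:3],
        "4-character": o[:4] == a[:4],
        "complete": o == a,
    }

# Illustrative pair of codes assigned by the original coder and the auditor.
print(agreement_levels("W01.02", "W01.09"))
# {'3-character': True, '4-character': True, 'complete': False}
```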
Abstract:
Cholesterol-lowering treatment with statins is an important and costly issue; however, its role in stroke has not been well documented. The aim of the present study was to review the literature and current practice regarding cholesterol-lowering treatment for stroke patients. A literature review was conducted on lipids in stroke and their management with both statins and diet, including the cost-effectiveness of medical nutrition therapy. Qualifying criteria and prescription procedures of the Pharmaceutical Benefits Scheme (PBS) were also reviewed. Data on lipid levels and statin prescriptions were analysed for 468 patients admitted to a stroke unit. The literature shows that management with both medication and diet can be effective, especially when combined; however, 60% of patients with an ischaemic event had fasting total cholesterol measures ≥4 mmol/L (n = 231), with only 52% prescribed statins on discharge (n = 120). Hypercholesterolaemia is an underdiagnosed and undertreated risk factor within the stroke population. It appears that the PBS has not kept pace with advances in the evidence on statin use in the stroke population, and review is needed. Such a review should address the qualifying criteria for the stroke population and include recommendations on referral to dietitians for dietary advice. Cholesterol-lowering treatment for both stroke patients and the wider population is an area that needs awareness raising and review by the PBS, medical practitioners and dietitians. The role of dietary and pharmacological treatments needs to be clearly defined, including adjunct therapy, and the cost-effectiveness of medical nutrition therapy realised.
Abstract:
BACKGROUND: The murine ghrelin gene (Ghrl), originally sequenced from stomach tissue, contains five exons and a single transcription start site in a short, 19 bp first exon (exon 0). We recently isolated several novel first exons of the human ghrelin gene and found evidence of a complex transcriptional repertoire. In this report, we examined the 5' exons of the murine ghrelin orthologue in a range of tissues using 5' RACE. FINDINGS: 5' RACE revealed two transcription start sites (TSSs) in exon 0 and four TSSs in intron 0, which correspond to 5' extensions of exon 1. Using quantitative, real-time RT-PCR (qRT-PCR), we demonstrated that extended exon 1-containing Ghrl transcripts are largely confined to the spleen, adrenal gland, stomach, and skin. CONCLUSION: We demonstrate that multiple transcription start sites are present in exon 0 and an extended exon 1 of the murine ghrelin gene, similar to the proximal first exon organisation of its human orthologue. The identification of several transcription start sites in intron 0 of mouse ghrelin (resulting in an extension of exon 1) raises the possibility that developmental-, cell- and tissue-specific Ghrl mRNA species are created by employing alternative promoters, and further studies of the murine ghrelin gene are warranted.
Abstract:
This study assesses the Vitamin D status of 126 healthy free-living adults aged 18–87 years, in southeast Queensland, Australia (27°S) at the end of the 2006 winter. Participants provided blood samples for analysis of 25(OH)D (the measure of an individual’s Vitamin D status), PTH, Calcium, Phosphate, and Albumin, completed a questionnaire on sun-protective/sun-exposure behaviours, and were assessed for phenotypic characteristics such as skin/hair/eye colour and BMI. We found that 10.2% of the participants had serum 25(OH)D levels below 25 nmol/l (considered deficient) and a further 32.3% had levels between 25 nmol/l and 50 nmol/l (considered insufficient). Our results show that low levels of 25(OH)D can occur in a substantial proportion of the population at the end of winter, even in a sunny climate. 25(OH)D levels were higher amongst those who spent more time in the sun and lower among obese participants (BMI > 30) than those who were not obese (BMI < 30). 25(OH)D levels were also lower in participants who had black hair, dark/olive skin, or brown eyes, when compared with participants who had brown or fair hair, fair skin, or blue/green eyes. No associations were found between 25(OH)D status and age, gender, smoking status, or the use of sunscreen.
Abstract:
Catheter-related bloodstream infections are a serious problem. Many interventions reduce risk, and some have been evaluated in cost-effectiveness studies. We review the usefulness and quality of these economic studies. Evidence is incomplete, and data required to inform a coherent policy are missing. The cost-effectiveness studies are characterized by a lack of transparency, short time-horizons, and narrow economic perspectives. Data quality is low for some important model parameters. Authors of future economic evaluations should aim to model the complete policy and not just single interventions. They should be rigorous in developing the structure of the economic model, include all relevant economic outcomes, use a systematic approach for selecting data sources for model parameters, and propagate the effect of uncertainty in model parameters on conclusions. This will inform future data collection and improve our understanding of the economics of preventing these infections.
Mitigating surgical risk in patients undergoing hip arthroplasty for fractures of the proximal femur
Abstract:
Recently the National Patient Safety Agency in the United Kingdom published a report entitled "Mitigating surgical risk in patients undergoing hip arthroplasty for fractures of the proximal femur". A total of 26 deaths had been reported to them when cement was used at hemiarthroplasty between October 2003 and October 2008. This paper considers the evidence for using cement fixation of a hemiarthroplasty in the treatment of hip fractures.
Abstract:
PURPOSE: To introduce techniques for deriving a map that relates visual field locations to optic nerve head (ONH) sectors and to use the techniques to derive a map relating Medmont perimetric data to data from the Heidelberg Retinal Tomograph. METHODS: Spearman correlation coefficients were calculated relating each visual field location (Medmont M700) to rim area and volume measures for 10° ONH sectors (HRT III software) for 57 participants: 34 with glaucoma, 18 with suspected glaucoma, and 5 with ocular hypertension. Correlations were constrained to be anatomically plausible with a computational model of the axon growth of retinal ganglion cells (Algorithm GROW). GROW generated a map relating field locations to sectors of the ONH. For each location, the sector with the maximum statistically significant (P < 0.05) correlation coefficient within 40° of the angle predicted by GROW was selected. Before correlation, both functional and structural data were normalized by either normative data or the fellow eye of each participant. RESULTS: The model of axon growth produced a 24-2 map that is qualitatively similar to existing maps derived from empiric data. When GROW was used in conjunction with normative data, 31% of field locations exhibited a statistically significant relationship. This proportion increased to 67% (z-test, z = 4.84; P < 0.001) when both field and rim area data were normalized with the fellow eye. CONCLUSIONS: A computational model of axon growth and normalizing data by the fellow eye can assist in constructing an anatomically plausible map connecting visual field data and sectoral ONH data.
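The mapping step described in the methods (correlate each field location with every ONH sector, then keep the most strongly correlated sector among those that are statistically significant and anatomically plausible) can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the significance threshold and the ±40° constraint come from the abstract, while the data, array shapes and function names are hypothetical.

```python
import numpy as np
from scipy.stats import spearmanr

def map_location_to_sector(field, rim, predicted_angle, sector_angles,
                           max_dev=40.0, alpha=0.05):
    """Pick, for one visual field location, the ONH sector with the largest
    significant Spearman correlation within max_dev degrees of the angle
    predicted by the anatomical model (e.g. GROW).

    field: per-subject sensitivities at this field location, shape (n_subjects,)
    rim:   per-subject rim measures for each sector, shape (n_subjects, n_sectors)
    """
    best_sector, best_rho = None, 0.0
    for s, sector_angle in enumerate(sector_angles):
        deviation = abs((sector_angle - predicted_angle + 180) % 360 - 180)
        if deviation > max_dev:
            continue  # anatomically implausible sector, skip
        rho, p = spearmanr(field, rim[:, s])
        if p < alpha and rho > best_rho:
            best_sector, best_rho = s, rho
    return best_sector, best_rho

# Hypothetical data: 57 subjects, 36 sectors of 10 degrees each.
rng = np.random.default_rng(1)
field = rng.normal(size=57)
rim = rng.normal(size=(57, 36))
sector_angles = np.arange(0, 360, 10)
print(map_location_to_sector(field, rim, predicted_angle=75.0, sector_angles=sector_angles))
```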
The STRATIFY tool and clinical judgment were poor predictors of falling in an acute hospital setting
Abstract:
Objective: To compare the effectiveness of the STRATIFY falls tool with nurses' clinical judgments in predicting patient falls. Study Design and Setting: A prospective cohort study was conducted among the inpatients of an acute tertiary hospital. Participants were patients over 65 years of age admitted to any hospital unit. Sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of the instrument and of nurses' clinical judgments in predicting falls were calculated. Results: Seven hundred and eighty-eight patients were screened and followed up during the study period. The fall prevalence was 9.2%. Of the 335 patients classified as being "at risk" of falling using the STRATIFY tool, 59 (17.6%) did sustain a fall (sensitivity = 0.82, specificity = 0.61, PPV = 0.18, NPV = 0.97). Nurses judged that 501 patients were at risk of falling and, of these, 60 (12.0%) fell (sensitivity = 0.84, specificity = 0.38, PPV = 0.12, NPV = 0.96). The STRATIFY tool correctly identified significantly more patients as either fallers or nonfallers than the nurses did (P = 0.027). Conclusion: Considering the poor specificity and high rates of false-positive results for both the STRATIFY tool and nurses' clinical judgments, we conclude that neither approach is useful for falls screening in acute hospital settings.
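The screening statistics reported above come directly from a 2×2 table of predicted versus observed falls. The sketch below recomputes them for the STRATIFY arm using counts reconstructed from the abstract (59 true positives among 335 flagged patients, and roughly 72 fallers among 788 screened, giving approximately 13 false negatives and 440 true negatives); the counts are approximate and shown only to illustrate the calculation.

```python
def screening_stats(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, PPV and NPV from a 2x2 screening table."""
    return {
        "sensitivity": tp / (tp + fn),  # flagged fallers / all fallers
        "specificity": tn / (tn + fp),  # unflagged non-fallers / all non-fallers
        "PPV": tp / (tp + fp),          # fallers among those flagged at risk
        "NPV": tn / (tn + fn),          # non-fallers among those not flagged
    }

# Approximate counts reconstructed from the STRATIFY arm of the abstract.
print(screening_stats(tp=59, fp=276, fn=13, tn=440))
# roughly: sensitivity 0.82, specificity 0.61, PPV 0.18, NPV 0.97
```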
Abstract:
Background Centers for Disease Control Guidelines recommend replacement of peripheral intravenous (IV) catheters every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bacteraemia. Catheter insertion is an unpleasant experience for patients and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. Costs associated with routine replacement may be considerable. Objectives To assess the effects of removing peripheral IV catheters when clinically indicated compared with removing and re-siting the catheter routinely.
Abstract:
Information and communication technologies (ICTs) have established their place in knowledge management and are now evolving towards an era of self-intelligence (Klosterman, 2001). In the 21st century, ICTs for urban development and planning are imperative for improving the quality of life and place. This includes the management of traffic, waste, electricity, sewerage and water quality, the monitoring of fire and crime, the conservation of renewable resources, and the coordination of urban policies and programs for urban planners, civil engineers, and government officers and administrators. The handling of tasks in the field of urban management often requires complex, interdisciplinary knowledge as well as profound technical information. Most of this information has been compiled during the last few years in the form of manuals, reports, databases, and programs. Frequently, however, these information resources and services are either not known or not readily available to the people who need them. To provide urban administrators and the public with comprehensive information and services, various ICTs are being developed. In the early 1990s Mark Weiser (1993) proposed the Ubiquitous Computing project at the Xerox Palo Alto Research Centre in the US. He provided a vision of a built environment in which digital networks link individual residents not only to other people but also to goods and services whenever and wherever they need them (Mitchell, 1999). Since then the Republic of Korea (ROK) has continuously developed national strategies for knowledge-based urban development (KBUD) through the agendas of Cyber Korea, E-Korea and U-Korea. Among these agendas, the U-Korea agenda in particular aims at the convergence of ICTs and urban space for prosperous urban and economic development. U-Korea strategies create a series of U-cities based on ubiquitous computing and ICTs by providing ubiquitous city (U-city) infrastructure and services in urban space. The goal of U-city development is not only to boost the national economy but also to create value in knowledge-based communities. It provides an opportunity for central and local governments to collaborate on U-city projects, optimise information utilisation, and minimise regional disparities. This chapter introduces the Korean-led U-city concept, planning and design schemes, and management policies, and discusses the implications of the U-city concept in planning for KBUD.
Abstract:
Police services in a number of Australian states and overseas jurisdictions have begun to implement or consider random roadside drug testing of drivers. This paper outlines research conducted to provide an estimate of the extent of drug driving in a sample of Queensland drivers in regional, rural and metropolitan areas. Oral fluid samples were collected from 2657 Queensland motorists and screened for illicit substances including cannabis (delta-9-tetrahydrocannabinol [THC]), amphetamines, ecstasy, and cocaine. Overall, 3.8% of the sample (n = 101) screened positive for at least one illicit substance, with multiple drugs identified in 23 of these respondents. The most common drugs detected in oral fluid were ecstasy (n = 53) and cannabis (n = 46), followed by amphetamines (n = 23). A key finding was that cannabis was confirmed as the most common self-reported drug combined with driving, and that individuals who tested positive to any drug through oral fluid analysis were also more likely to report the highest frequency of drug driving. Furthermore, a comparison between drug and drink driving detection rates for one region of the study revealed a higher detection rate for drug driving (3.8%) than for drink driving (0.8%). This research provides evidence that drug driving is relatively prevalent on Queensland roads, and may in fact be more common than drink driving. This paper further outlines the study findings and presents possible directions for future drug driving research.
Abstract:
Driving under the influence (DUI) is a major road safety problem. Historically, alcohol has been assumed to play the larger role in crashes, and DUI education programs have reflected this assumption, although recent evidence suggests that younger drivers are becoming more likely to drive drugged than to drive drunk. This is a study of 7096 Texas clients under age 21 who were admitted to state-funded treatment programs between 1997 and 2007 with a past-year DUI arrest, DUI probation, or DUI referral. Data were obtained from the State's administrative dataset. Multivariate logistic regression models were used to understand the differences between minors entering treatment as DUI versus non-DUI clients, as well as the predictors of completing treatment and of being abstinent in the month prior to follow-up. A major finding was that, over time, the primary problem for underage DUI drivers changed from alcohol to marijuana. Being abstinent in the month prior to discharge, having a primary problem with alcohol rather than another drug, and having more family involvement were the strongest predictors of treatment completion. Living in a household where the client was exposed to alcohol abuse or drug use, having been in residential treatment, and having more drug, alcohol and family problems were the strongest predictors of not being abstinent at follow-up. As a result, there is a need to direct more attention towards meeting the needs of the young DUI population through programs that address drug as well as alcohol consumption problems.
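Analyses like the one described above typically fit a separate logistic regression for each binary outcome (e.g. treatment completion, abstinence at follow-up) on a set of client characteristics and report odds ratios. The sketch below shows that generic pattern with statsmodels; the variable names and data are hypothetical stand-ins, not the Texas administrative dataset or the authors' model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical client-level data standing in for the administrative dataset.
rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "completed_treatment": rng.integers(0, 2, n),     # binary outcome
    "abstinent_at_discharge": rng.integers(0, 2, n),  # candidate predictors
    "primary_problem_alcohol": rng.integers(0, 2, n),
    "family_involvement": rng.integers(0, 5, n),
})

# Logistic regression of treatment completion on the candidate predictors.
X = sm.add_constant(df[["abstinent_at_discharge", "primary_problem_alcohol", "family_involvement"]])
model = sm.Logit(df["completed_treatment"], X).fit(disp=False)

# Odds ratios with 95% confidence intervals, the usual way such results are reported.
odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_lower": np.exp(model.conf_int()[0]),
    "CI_upper": np.exp(model.conf_int()[1]),
})
print(odds_ratios)
```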