505 results for Clinical consequences
in Queensland University of Technology - ePrints Archive
Abstract:
Ad[I/PPT-E1A] is an oncolytic adenovirus that specifically kills prostate cells via restricted replication by a prostate-specific regulatory element. Off-target replication of oncolytic adenoviruses would have serious clinical consequences. As a proposed ex vivo test, we describe the assessment of the specificity of Ad[I/PPT-E1A] viral cytotoxicity and replication in human nonprostate primary cells. Four primary nonprostate cell types were selected to mimic the effects of potential in vivo exposure to Ad[I/PPT-E1A] virus: bronchial epithelial cells, urothelial cells, vascular endothelial cells, and hepatocytes. Primary cells were analyzed for Ad[I/PPT-E1A] viral cytotoxicity in MTS assays, and viral replication was determined by hexon titer immunostaining assays to quantify viral hexon protein. The results revealed that at an extreme multiplicity of infection of 500, unlikely to be achieved in vivo, Ad[I/PPT-E1A] virus showed no significant cytotoxic effects in the nonprostate primary cell types apart from the hepatocytes. Transmission electron microscopy studies revealed high levels of Ad[I/PPT-E1A] sequestered in the cytoplasm of these cells. Adenoviral green fluorescent protein reporter studies showed no evidence for nuclear localization, suggesting that the cytotoxic effects of Ad[I/PPT-E1A] in human primary hepatocytes are related to viral sequestration. Also, hepatocytes had increased amounts of coxsackie adenovirus receptor surface protein. Active viral replication was only observed in the permissive primary prostate cells and LNCaP prostate cell line, and was not evident in any of the other nonprostate cell types tested, confirming the specificity of Ad[I/PPT-E1A]. Thus, using a relevant panel of primary human cells provides a convenient and alternative preclinical assay for examining the specificity of conditionally replicating oncolytic adenoviruses in vivo.
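A purely illustrative note on the multiplicity of infection (MOI) quoted above: MOI is the ratio of infectious units to cells, so a target MOI fixes the amount of virus stock to add. The sketch below shows this arithmetic; the cell count and stock titre are hypothetical, as the abstract reports only the MOI of 500.

```python
# Illustrative only: converts a target MOI into the volume of virus stock per well.
# The cell count and stock titre are hypothetical; the abstract states only MOI 500.

def virus_volume_ul(moi_pfu_per_cell: float, cells_per_well: float,
                    stock_titre_pfu_per_ml: float) -> float:
    """Microlitres of virus stock needed to reach the target MOI."""
    pfu_needed = moi_pfu_per_cell * cells_per_well       # total infectious units required
    volume_ml = pfu_needed / stock_titre_pfu_per_ml      # millilitres of stock
    return volume_ml * 1000                              # convert to microlitres

# e.g. 1e4 primary cells per well and a stock of 1e10 PFU/mL (both assumed)
print(f"{virus_volume_ul(500, 1e4, 1e10):.2f} uL per well")  # -> 0.50 uL per well
```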
Abstract:
Selenium (Se) is an essential trace element and the clinical consequences of Se deficiency have been well-documented. Se is primarily obtained through the diet and recent studies have suggested that the level of Se in Australian foods is declining. Currently there is limited data on the Se status of the Australian population so the aim of this study was to determine the plasma concentration of Se and glutathione peroxidase (GSH-Px), a well-established biomarker of Se status. Furthermore, the effect of gender, age and presence of cardiovascular disease (CVD) was also examined. Blood plasma samples from healthy subjects (140 samples, mean age = 54 years; range, 20-86 years) and CVD patients (112 samples, mean age = 67 years; range, 40-87 years) were analysed for Se concentration and GSH-Px activity. The results revealed that the healthy Australian cohort had a mean plasma Se level of 100.2 +/- 1.3 microg Se/L and a mean GSH-Px activity of 108.8 +/- 1.7 U/L. Although the mean value for plasma Se reached the level required for optimal GSH-Px activity (i.e. 100 microg Se/L), 47% of the healthy individuals tested fell below this level. Further evaluation revealed that certain age groups were more at risk of a lowered Se status, in particular, the oldest age group of over 81 years (females = 97.6 +/- 6.1 microg Se/L; males = 89.4 +/- 3.8 microg Se/L). The difference in Se status between males and females was not found to be significant. The presence of CVD did not appear to influence Se status, with the exception of the over 81 age group, which showed a trend for a further decline in Se status with disease (plasma Se, 93.5 +/- 3.6 microg Se/L for healthy versus 88.2 +/- 5.3 microg Se/L for CVD; plasma GSH-Px, 98.3 +/- 3.9 U/L for healthy versus 87.0 +/- 6.5 U/L for CVD). These findings emphasise the importance of an adequate dietary intake of Se for the maintenance of a healthy ageing population, especially in terms of cardiovascular health.
Abstract:
Background Indigenous children in high-income countries have a heavy burden of bronchiectasis unrelated to cystic fibrosis. We aimed to establish whether long-term azithromycin reduced pulmonary exacerbations in Indigenous children with non-cystic-fibrosis bronchiectasis or chronic suppurative lung disease. Methods Between Nov 12, 2008, and Dec 23, 2010, we enrolled Indigenous Australian, Maori, and Pacific Island children aged 1-8 years with either bronchiectasis or chronic suppurative lung disease into a multicentre, double-blind, randomised, parallel-group, placebo-controlled trial. Eligible children had had at least one pulmonary exacerbation in the previous 12 months. Children were randomised (1:1 ratio, by computer-generated sequence with permuted block design, stratified by study site and exacerbation frequency [1-2 vs ≥3 episodes in the preceding 12 months]) to receive either azithromycin (30 mg/kg) or placebo once a week for up to 24 months. Allocation concealment was achieved by double-sealed, opaque envelopes; participants, caregivers, and study personnel were masked to assignment until after data analysis. The primary outcome was exacerbation (respiratory episodes treated with antibiotics) rate. Analysis of the primary endpoint was by intention to treat. At enrolment and at their final clinic visits, children had deep nasal swabs collected, which we analysed for antibiotic-resistant bacteria. This study is registered with the Australian New Zealand Clinical Trials Registry; ACTRN12610000383066. Findings 45 children were assigned to azithromycin and 44 to placebo. The study was stopped early for feasibility reasons on Dec 31, 2011; thus, children received the intervention for 12-24 months. The mean treatment duration was 20·7 months (SD 5·7), with a total of 902 child-months in the azithromycin group and 875 child-months in the placebo group. Compared with the placebo group, children receiving azithromycin had significantly lower exacerbation rates (incidence rate ratio 0·50; 95% CI 0·35-0·71; p<0·0001). However, children in the azithromycin group developed significantly higher carriage of azithromycin-resistant bacteria (19 of 41, 46%) than those receiving placebo (four of 37, 11%; p=0·002). The most common adverse events were non-pulmonary infections (71 of 112 events in the azithromycin group vs 132 of 209 events in the placebo group) and bronchiectasis-related events (episodes or investigations; 22 of 112 events in the azithromycin group vs 48 of 209 events in the placebo group); however, study drugs were well tolerated with no serious adverse events being attributed to the intervention. Interpretation Once-weekly azithromycin for up to 24 months decreased pulmonary exacerbations in Indigenous children with non-cystic-fibrosis bronchiectasis or chronic suppurative lung disease. However, this strategy was also accompanied by increased carriage of azithromycin-resistant bacteria, the clinical consequences of which are uncertain, and will need careful monitoring and further study.
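For readers unfamiliar with the headline statistic, the incidence rate ratio compares exacerbations per unit of person-time between the two arms. The sketch below shows the standard calculation with a Wald confidence interval; the event counts are hypothetical, since the abstract reports the person-time (902 vs 875 child-months) and the resulting IRR (0·50, 95% CI 0·35-0·71) but not the raw counts.

```python
# A minimal sketch of an incidence rate ratio (IRR) with a Wald 95% CI.
# Event counts below are hypothetical; person-time values are those reported.
import math

def rate_ratio_ci(events_trt, time_trt, events_ctl, time_ctl, z=1.96):
    irr = (events_trt / time_trt) / (events_ctl / time_ctl)
    se_log = math.sqrt(1 / events_trt + 1 / events_ctl)   # SE of log(IRR)
    lower = math.exp(math.log(irr) - z * se_log)
    upper = math.exp(math.log(irr) + z * se_log)
    return irr, lower, upper

print(rate_ratio_ci(events_trt=60, time_trt=902, events_ctl=116, time_ctl=875))
# -> roughly (0.50, 0.37, 0.69) with these made-up counts
```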
Abstract:
Cystic fibrosis (CF) patients require pancreatic enzyme replacement therapy to correct pancreatic insufficiency. These enzymes are derived from porcine pancreas and are known to be antigenic. To determine the possible clinical consequences, a specific ELISA was developed to detect IgG antibody directed against porcine trypsin (PTAb) in the sera of CF patients. The assay was used to evaluate the occurrence of PTAb in a cross sectional study of 103 CF patients in relation to the introduction of porcine enzyme therapy, clinical status and genotype. Antibodies against porcine trypsin were detected in the sera of 63% of patients unrelated to the age of commencement or the duration of enzyme therapy. No differences were observed in the clinical status of CF patients who had developed PTAb (n = 65) and those who had no detectable PTAb (n = 38) as determined from: the current prescribed dose of porcine pancreatic enzyme capsules; Z scores for height and weight; and respiratory function tests. It is suggested that the PTAb commonly found in the sera of CF patients are of doubtful clinical significance but the prospect of PTAb contributing to immune complex disease should be examined further.
Abstract:
Fusionless scoliosis surgery is an emerging treatment for idiopathic scoliosis as it offers theoretical advantages over current forms of treatment. Anterior vertebral stapling using a nitinol staple is one such treatment. Despite increasing interest in this technique, little is known about the effects on the spine following insertion, or the mechanism of action of the staple. The aims of this study were threefold: (1) to measure changes in the bending stiffness of a single motion segment following staple insertion, (2) to describe the forces that occur within the staple during spinal movement, and (3) to describe the anatomical changes that occur following staple insertion. Results suggest that staple insertion consistently decreased stiffness in all directions of motion. An explanation for the finding may be found in the outcomes of the strain gauge testing and micro-CT scan. The strain gauge testing showed that once inserted, the staple tips applied a baseline compressive force to the surrounding trabecular bone and vertebral end-plate. This finding would be consistent with the current belief that the clinical effect of the staples is via unilateral compression of the physis. Interestingly, however, as each specimen progressed through the five cycles of each test, the baseline load on the staple tips gradually decreased, implying that the force at the staple tip-bone interface was decreasing. We believe that this was likely occurring as a result of structural damage to the trabecular bone and vertebral end-plate by the staple, effectively causing ‘loosening’ of the staple. This hypothesis is further supported by the findings of the micro-CT scan. The pictures depict significant trabecular bone and physeal injury around the staple blades. These results suggest that the current hypothesis that stapling modulates growth through physeal compression may be incorrect, and that the effect instead occurs through mechanical disruption of the vertebral growth plate.
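Since the primary outcome above is a change in bending stiffness, it may help to note that stiffness in such tests is typically taken as the slope of the applied moment versus angular displacement of the motion segment. The sketch below illustrates that reduction with simulated data; the numbers and the fitting approach are assumptions, not the study's actual protocol.

```python
# A minimal sketch: bending stiffness estimated as the least-squares slope of
# applied moment (Nm) versus rotation (deg). All data points are hypothetical.
import numpy as np

def bending_stiffness(angles_deg, moments_nm):
    """Slope of moment vs rotation, in Nm per degree."""
    slope, _intercept = np.polyfit(angles_deg, moments_nm, deg=1)
    return slope

angles  = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # deg
intact  = np.array([0.0, 0.9, 1.8, 2.6, 3.5])   # Nm, before staple insertion (simulated)
stapled = np.array([0.0, 0.7, 1.5, 2.2, 2.9])   # Nm, after staple insertion (simulated)
print(bending_stiffness(angles, intact), bending_stiffness(angles, stapled))
# the stapled segment shows the lower slope, i.e. reduced stiffness
```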
Abstract:
Leucodepletion, the removal of leucocytes from blood products, improves the safety of blood transfusion by reducing adverse events associated with the incidental non-therapeutic transfusion of leucocytes. Leucodepletion has been shown to have clinical benefit for immuno-suppressed patients who require transfusion. The selective leucodepletion of blood products by bedside filtration for these patients has been widely practiced. This study investigated the economic consequences in Queensland of moving from a policy of selective leucodepletion to one of universal leucodepletion, that is, providing all transfused patients with blood products leucodepleted during the manufacturing process. Using an analytic decision model, a cost-effectiveness analysis was conducted. An ICER of $16.3M per life year gained was derived. Sensitivity analysis found this result to be robust to uncertainty in the parameters used in the model. This result argues against moving to a policy of universal leucodepletion. However, during the course of the study, the policy decision for universal leucodepletion was made and implemented in Queensland in October 2008. This study has concluded that cost-effectiveness is not an influential factor in policy decisions regarding quality and safety initiatives in the Australian blood sector.
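The ICER reported above follows the usual definition of incremental cost divided by incremental health gain. The sketch below shows that arithmetic; the incremental cost and life-year inputs are hypothetical and chosen only so the output matches the order of magnitude reported, not taken from the study's model.

```python
# A minimal sketch of an incremental cost-effectiveness ratio (ICER) calculation.
# Inputs are hypothetical; the study's actual model inputs are not in the abstract.

def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost per incremental unit of effect (here, life years gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# e.g. universal vs selective leucodepletion with assumed costs and life years
value = icer(cost_new=25_000_000, cost_old=8_700_000, effect_new=101.0, effect_old=100.0)
print(f"${value:,.0f} per life year gained")  # -> $16,300,000 per life year gained
```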
Abstract:
The current study sought to understand adolescent protective behavior in friendship using a Theory of Planned Behavior framework. In particular, the study sought to consider a young person's direct and active intervention to inhibit their friends' risky behavior or to assist them when the behavior leads to injury. The roles of attitudes regarding the consequences, norms and perceived control over protective behavior were examined both qualitatively through focus groups (n = 50) and quantitatively through surveys from a sample of 540 Year 9 students (13-14 years old). There was some support for the theory, with attitudes regarding the consequences of the behavior and norms predicting intended protective behavior. A path analysis was conducted with a sub-sample of 140 students, which showed that intentions to be protective and perceived control to undertake protective behavior directly predicted such behavior after a 3-month interval. Attitudes towards the consequences and norms only indirectly predicted protective behavior via intention. The findings provide important applied information for interventions designed to increase adolescents' protective behavior in their friendships.
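As an aside for readers less familiar with the Theory of Planned Behavior analysis described above, the path structure amounts to two regression steps: attitudes and norms predicting intention, and intention plus perceived control predicting later behavior. The simulation below sketches that structure; the data, coefficients and estimation method are illustrative assumptions, not the study's results.

```python
# A minimal sketch of the two-step TPB path structure with simulated data.
# Coefficients and sample values are illustrative, not the study's estimates.
import numpy as np

rng = np.random.default_rng(0)
n = 140                                             # size of the follow-up sub-sample
attitude  = rng.normal(size=n)
norms     = rng.normal(size=n)
control   = rng.normal(size=n)
intention = 0.5 * attitude + 0.3 * norms + rng.normal(scale=0.5, size=n)
behavior  = 0.6 * intention + 0.3 * control + rng.normal(scale=0.5, size=n)

def ols_slopes(y, *predictors):
    """Ordinary least-squares slopes (intercept dropped)."""
    X = np.column_stack([np.ones_like(y), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

print("intention ~ attitude + norms:", ols_slopes(intention, attitude, norms))
print("behavior ~ intention + control:", ols_slopes(behavior, intention, control))
```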
Abstract:
Background Substance use is common among gay/bisexual men and is associated with significant health risks (e.g. HIV transmission). The consequences of substance use, across the range of substances commonly used, have received little attention. The purpose of this study is to map participants' beliefs about the effects of substance use to inform prevention, health promotion and clinical interventions. Methods Participants were recruited through medical and sexual health clinics and interviewed about their experiences of substance use. Data were collected through a consumer panel and individual interviews. Responses regarding perceived consequences of substance use were coded using Consensual Qualitative Research (CQR) methodology. Results Most participants reported lifetime use of alcohol, cannabis, stimulants and amyl nitrite, and recent alcohol and cannabis use. A wide range of themes were identified regarding participants' thoughts, emotions and behaviours (including sexual behaviours) secondary to substance use, including: cognitive functioning, mood, social interaction, physical effects, sexual activity, sexual risk-taking, perception of sexual experience, arousal, sensation, relaxation, disinhibition, energy/activity level and numbing. Analyses indicated several consequences were consistent across substance types (e.g. cognitive impairment, enhanced mood), whereas others were highly specific to a given substance (e.g. heightened arousal post amyl nitrite use). Conclusions Prevention and intervention efforts need to consider the variety of effects of substance use in tailoring effective education programs to reduce harms. A diversity of consequences appears to have direct and indirect impacts on decision-making, sexual activity and risk-taking. Findings support the role of specific beliefs (e.g. expectancies) about substance use in risk-related cognitions, emotions and behaviours.
Abstract:
Older adults, especially those acutely ill, are vulnerable to developing malnutrition due to a range of risk factors. The high prevalence and extensive consequences of malnutrition in hospitalised older adults have been reported extensively. However, there are few well-designed longitudinal studies that report the independent relationship between malnutrition and clinical outcomes after adjustment for a wide range of covariates. Acutely ill older adults are exceptionally prone to nutritional decline during hospitalisation, but few reports have studied this change and impact on clinical outcomes. In the rapidly ageing Singapore population, all this evidence is lacking, and the characteristics associated with the risk of malnutrition are also not well-documented. Despite the evidence on malnutrition prevalence, it is often under-recognised and under-treated. It is therefore crucial that validated nutrition screening and assessment tools are used for early identification of malnutrition. Although many nutrition screening and assessment tools are available, there is no universally accepted method for defining malnutrition risk and nutritional status. Most existing tools have been validated amongst Caucasians using various approaches, but they are rarely reported in the Asian elderly and none has been validated in Singapore. Due to the ethnic, cultural, and language differences in Singapore's older adults, the results from non-Asian validation studies may not be applicable. Therefore it is important to identify validated population- and setting-specific nutrition screening and assessment methods to accurately detect and diagnose malnutrition in Singapore. The aims of this study are therefore to: i) characterise hospitalised elderly in a Singapore acute hospital; ii) describe the extent and impact of admission malnutrition; iii) identify and evaluate suitable methods for nutritional screening and assessment; and iv) examine changes in nutritional status during admission and their impact on clinical outcomes. A total of 281 participants, with a mean (±SD) age of 81.3 (±7.6) years, were recruited from three geriatric wards in Tan Tock Seng Hospital over a period of eight months. They were predominantly Chinese (83%) and community-dwellers (97%). They were screened within 72 hours of admission by a single dietetic technician using four nutrition screening tools [Tan Tock Seng Hospital Nutrition Screening Tool (TTSH NST), Nutritional Risk Screening 2002 (NRS 2002), Mini Nutritional Assessment-Short Form (MNA-SF), and Short Nutritional Assessment Questionnaire (SNAQ©)] that were administered in no particular order. The total scores were not computed during the screening process so that the dietetic technician was blinded to the results of all the tools. Nutritional status was assessed by a single dietitian, who was blinded to the screening results, using four malnutrition assessment methods [Subjective Global Assessment (SGA), Mini Nutritional Assessment (MNA), body mass index (BMI), and corrected arm muscle area (CAMA)]. The SGA rating was completed prior to computation of the total MNA score to minimise bias. Participants were reassessed for weight, arm anthropometry (mid-arm circumference, triceps skinfold thickness), and SGA rating at discharge from the ward.
The nutritional assessment tools and indices were validated against clinical outcomes (length of stay (LOS) >11 days, discharge to higher level care, 3-month readmission, 6-month mortality, and 6-month Modified Barthel Index) using multivariate logistic regression. The covariates included age, gender, race, dementia (defined using DSM-IV criteria), depression (defined using a single question "Do you often feel sad or depressed?"), severity of illness (defined using a modified version of the Severity of Illness Index), comorbidities (defined using the Charlson Comorbidity Index), number of prescribed drugs, and admission functional status (measured using the Modified Barthel Index; MBI). The nutrition screening tools were validated against the SGA, which was found to be the most appropriate nutritional assessment tool from this study (refer to Section 5.6). Prevalence of malnutrition on admission was 35% (defined by SGA), and it was significantly associated with characteristics such as swallowing impairment (malnourished vs well-nourished: 20% vs 5%), poor appetite (77% vs 24%), dementia (44% vs 28%), depression (34% vs 22%), and poor functional status (MBI 48.3±29.8 vs 65.1±25.4). The SGA had the highest completion rate (100%) and was predictive of the highest number of clinical outcomes: LOS >11 days (OR 2.11, 95% CI [1.17-3.83]), 3-month readmission (OR 1.90, 95% CI [1.05-3.42]) and 6-month mortality (OR 3.04, 95% CI [1.28-7.18]), independent of a comprehensive range of covariates including functional status, disease severity and cognitive function. SGA is therefore the most appropriate nutritional assessment tool for defining malnutrition. The TTSH NST was identified as the most suitable nutritional screening tool with the best diagnostic performance against the SGA (AUC 0.865, sensitivity 84%, specificity 79%). Overall, 44% of participants experienced weight loss during hospitalisation, and 27% had weight loss >1% per week over median LOS 9 days (range 2-50). Well-nourished (45%) and malnourished (43%) participants were equally prone to experiencing decline in nutritional status (defined by weight loss >1% per week). Those with reduced nutritional status were more likely to be discharged to higher level care (adjusted OR 2.46, 95% CI [1.27-4.70]). This study is the first to characterise malnourished hospitalised older adults in Singapore. It is also one of the very few studies to (a) evaluate the association of admission malnutrition with clinical outcomes in a multivariate model; (b) determine the change in their nutritional status during admission; and (c) evaluate the validity of nutritional screening and assessment tools amongst hospitalised older adults in an Asian population. Results clearly highlight that admission malnutrition and deterioration in nutritional status are prevalent and are associated with adverse clinical outcomes in hospitalised older adults. With older adults being vulnerable to risks and consequences of malnutrition, it is important that they are systematically screened so timely and appropriate intervention can be provided. The findings highlighted in this thesis provide an evidence base for, and confirm the validity of, the current nutrition screening and assessment tools used among hospitalised older adults in Singapore.
As the older adults may have developed malnutrition prior to hospital admission, or experienced clinically significant weight loss of >1% per week of hospitalisation, screening of the elderly should be initiated in the community and continuous nutritional monitoring should extend beyond hospitalisation.
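To make the diagnostic-performance figures above concrete (e.g. the TTSH NST's AUC 0.865, sensitivity 84% and specificity 79% against the SGA), the sketch below shows how such metrics are computed when a screening score is compared against a reference classification. The scores, cut-off and prevalence pattern are simulated, not the study's data.

```python
# A minimal sketch of validating a screening score against a reference standard
# (here labelled "SGA"): sensitivity and specificity at a cut-off, and AUC.
# All values are simulated; they are not the TTSH NST or SGA data.
import numpy as np

def sensitivity_specificity(scores, malnourished, cutoff):
    pred = scores >= cutoff                           # screen-positive
    tp = np.sum(pred & malnourished)
    tn = np.sum(~pred & ~malnourished)
    fn = np.sum(~pred & malnourished)
    fp = np.sum(pred & ~malnourished)
    return tp / (tp + fn), tn / (tn + fp)

def auc_rank(scores, malnourished):
    """AUC as the probability a malnourished case outscores a well-nourished one."""
    pos, neg = scores[malnourished], scores[~malnourished]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

rng = np.random.default_rng(1)
malnourished = rng.random(281) < 0.35                 # ~35% prevalence, as reported
scores = rng.normal(loc=np.where(malnourished, 4.0, 2.0), scale=1.5)
sens, spec = sensitivity_specificity(scores, malnourished, cutoff=3.0)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} AUC={auc_rank(scores, malnourished):.2f}")
```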
Abstract:
Schweitzer et al. previously published a paper in the Australian and New Zealand Journal of Psychiatry which provided prevalence rates on suicidal ideation and behaviour among university students [1]. We wish to provide an update on extensions of our previously published work. In our previous publication we indicated the relatively high percentage of students who reported suicide-related behaviour over the past 12 months (6.6%). This figure is very similar to a more recent study undertaken in the UK where 6% of student respondents reported suicide attempts [2]. As a follow up, we investigated this finding further in studies undertaken in 1994 and 1997 by asking fresh samples of University of Queensland first-year undergraduates who responded positively to the question ‘I have made attempts to kill myself’ (in the past year), to provide additional data relating to the methods employed in their suicide attempts and the consequences following their suicide attempt in terms of level of injury and medical care received...
Abstract:
Rapidly developing proteomic tools are improving detection of deregulated kallikrein-related peptidase (KLK) expression, at the protein level, in prostate and ovarian cancer, as well as facilitating the determination of functional consequences downstream. Mass spectrometry (MS)-driven proteomics uniquely allows for the detection, identification and quantification of thousands of proteins in a complex protein pool, and this has served to identify certain KLKs as biomarkers for these diseases. In this review we describe applications of this technology in KLK biomarker discovery, and elucidate MS-based techniques which have been used for unbiased, global screening of KLK substrates within complex protein pools. Although MS-based KLK degradomic studies are limited to date, they helped to discover an array of novel KLK substrates. Substrates identified by MS-based degradomics are reported with improved confidence over those determined by incubating a purified or recombinant substrate and protease of interest, in vitro. We propose that these novel proteomic approaches represent the way forward for KLK research, in order to correlate proteolysis of biological substrates with tissue-related consequences, toward clinical targeting of KLK expression and function for cancer diagnosis, prognosis and therapies.
Abstract:
With measurement of physical activity becoming more common in clinical practice, it is imperative that healthcare professionals become more knowledgeable about the different methods available to objectively measure physical activity behaviour. Objective measures do not rely on information provided by the patient, but instead measure and record the biomechanical or physiological consequences of performing physical activity, often in real time. As such, objective measures are not subject to the reporting bias or recall problems associated with self-report methods. The purpose of this article was to provide an overview of the different methods used to objectively measure physical activity in clinical practice. The review was delimited to heart rate monitoring, accelerometers and pedometers, since their small size, low participant burden and relatively low cost make these objective measures appropriate for use in clinical practice settings. For each measure, strengths and weaknesses were discussed and, whenever possible, literature-based examples of implementation were provided.
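As a concrete illustration of how accelerometer output is typically handled, the sketch below reduces minute-by-minute activity counts to time spent in intensity categories. The cut-points are the commonly cited Freedson adult thresholds and are used here only as an assumption; the article itself does not prescribe particular devices or cut-points.

```python
# A minimal sketch: classify accelerometer counts/min into intensity categories and
# tally wear-time minutes. Cut-points are commonly cited adult values (an assumption).

CUTPOINTS = [(0, "sedentary"), (100, "light"), (1952, "moderate"), (5725, "vigorous")]

def classify(counts_per_min: int) -> str:
    label = CUTPOINTS[0][1]
    for threshold, name in CUTPOINTS:
        if counts_per_min >= threshold:
            label = name
    return label

def minutes_by_intensity(counts: list[int]) -> dict[str, int]:
    summary: dict[str, int] = {}
    for c in counts:
        label = classify(c)
        summary[label] = summary.get(label, 0) + 1
    return summary

# e.g. a short, hypothetical stretch of minute-level wear time
print(minutes_by_intensity([50, 80, 300, 2500, 2600, 6000, 1200, 40]))
```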
Abstract:
OBJECTIVE To investigate the impact of new-onset diabetic ketoacidosis (DKA) during childhood on brain morphology and function. RESEARCH DESIGN AND METHODS Patients aged 6–18 years with and without DKA at diagnosis were studied at four time points: <48 h, 5 days, 28 days, and 6 months postdiagnosis. Patients underwent magnetic resonance imaging (MRI) and spectroscopy with cognitive assessment at each time point. Relationships between clinical characteristics at presentation and MRI and neurologic outcomes were examined using multiple linear regression, repeated-measures, and ANCOVA analyses. RESULTS Thirty-six DKA and 59 non-DKA patients were recruited between 2004 and 2009. With DKA, cerebral white matter showed the greatest alterations with increased total white matter volume and higher mean diffusivity in the frontal, temporal, and parietal white matter. Total white matter volume decreased over the first 6 months. For gray matter in DKA patients, total volume was lower at baseline and increased over 6 months. Lower levels of N-acetylaspartate were noted at baseline in the frontal gray matter and basal ganglia. Mental state scores were lower at baseline and at 5 days. Of note, although changes in total and regional brain volumes over the first 5 days resolved, they were associated with poorer delayed memory recall and poorer sustained and divided attention at 6 months. Age at time of presentation and pH level were predictors of neuroimaging and functional outcomes. CONCLUSIONS DKA at type 1 diabetes diagnosis results in morphologic and functional brain changes. These changes are associated with adverse neurocognitive outcomes in the medium term.
Abstract:
The importance of the isoform CYP2E1 of the human cytochrome P-450 superfamily of enzymes for occupational and environmental medicine is derived from its unique substrate spectrum that includes a number of highly important high-production chemicals, such as aliphatic and aromatic hydrocarbons, solvents and industrial monomers (i.a. alkanes, alkenes, aromatic and halogenated hydrocarbons). Many polymorphic genes, such as CYP2E1, show considerable differences in allelic distribution between different human populations. The polymorphic nature of the human CYP2E1 gene is significant for inter-individual differences in toxicity of its substrates. Since the substrate spectrum of CYP2E1 includes many compounds of basic relevance to industrial toxicology, a rationale for metabolic interactions of different CYP2E1 substrates is provided. In-depth research into the inter-individual phenotypic differences of human CYP2E1 enzyme activities was enabled by the recognition that the 6-hydroxylation of the drug chlorzoxazone is mediated by CYP2E1. Studies on CYP2E1 phenotyping have pointed to inter-individual variations in enzyme activities. There are consistent ethnic differences in CYP2E1 enzyme expression, mostly demonstrated between European and Japanese populations, which point to a major impact of genetic factors. The most frequently studied genetic polymorphisms are the restriction fragment length polymorphisms PstI/RsaI (mutant allele: CYP2E1*5B) located in the 5′-flanking region of the gene, as well as the DraI polymorphism (mutant allele: CYP2E1*6) located in intron 6. These polymorphisms are partly related, as they form the common allele designated CYP2E1*5A. Striking inter-ethnic differences between Europeans and Asians appear with respect to the frequencies of the CYP2E1*5A allele (only approximately 5% of Europeans are heterozygous, but 37% of Asians are, whilst 6% of Asians are homozygous). Available studies indicate a wide variation in human CYP2E1 expression, which is very likely based on complex gene-environment interactions. Major inter-ethnic differences are apparent on the genotyping and the phenotyping levels. Selected cases are presented where inter-ethnic variations of CYP2E1 may provide likely explanations for unexplained findings concerning industrial chemicals that are CYP2E1 substrates. Possible consequences of differential inter-individual and inter-ethnic susceptibilities are related to individual expressions of clinical symptoms of chemical toxicity, to results of biological monitoring of exposed workers, and to the interpretation of results of epidemiological or molecular-epidemiological studies.
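A small arithmetic check (not taken from the paper) shows that the Asian genotype frequencies quoted above are internally consistent under Hardy-Weinberg assumptions: 37% heterozygotes and 6% homozygotes both imply a CYP2E1*5A allele frequency of roughly 0.24-0.25.

```python
# Hardy-Weinberg consistency check for the reported Asian CYP2E1*5A frequencies.
# This is an illustrative calculation, not an analysis performed in the paper.
import math

het, hom = 0.37, 0.06                      # reported heterozygote / homozygote frequencies
q_counted = hom + het / 2                  # allele frequency by allele counting
q_from_hom = math.sqrt(hom)                # allele frequency implied by q^2 = hom
expected_het = 2 * q_from_hom * (1 - q_from_hom)

print(f"q (allele counting)    = {q_counted:.3f}")     # ~0.245
print(f"q (from homozygotes)   = {q_from_hom:.3f}")    # ~0.245
print(f"expected heterozygotes = {expected_het:.3f}")  # ~0.37, matching the report
```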
Abstract:
Background Brominated flame retardants (BFRs) are chemicals widely used in consumer products including electronics, vehicles, plastics and textiles to reduce flammability. Experimental animal studies have confirmed that these compounds may interfere with thyroid hormone homeostasis and neurodevelopment, but to date health effects in humans have not been systematically examined. Objectives To conduct a systematic review of studies on the health impacts of exposure to BFRs in humans, with a particular focus on children. Methods A systematic review was conducted using the Medline and EMBASE electronic databases up to 1 February 2012. Published cohort, cross-sectional, and case-control studies exploring the relationship between BFR exposure and various health outcomes were included. Results In total, 36 epidemiological studies meeting the pre-determined inclusion criteria were included. Plausible outcomes associated with BFR exposure include diabetes, neurobehavioral and developmental disorders, cancer, reproductive health effects and alteration in thyroid function. Evidence for a causal relationship between exposure to BFRs and health outcomes was evaluated within the Bradford Hill framework. Conclusion Although there is suggestive evidence that exposure to BFRs is harmful to health, further epidemiological investigations, particularly among children, and long-term monitoring and surveillance of chemical impacts on humans are required to confirm these relationships.