994 results for Genetics, Medical
Abstract:
Typical migraine is a complex neurological disorder comprising two main subtypes: migraine with aura (MA) and migraine without aura (MO). The disease etiology is still unclear, but family studies provide strong evidence that defective genes play an important role. Familial hemiplegic migraine (FHM) is a very rare and severe subtype of MA. It has been proposed that FHM and MA may have a similar genetic etiology. Therefore, genetic studies on FHM provide a useful model for investigating the more prevalent types of typical migraine. FHM in some families has been shown to be caused by mutations in a brain-specific P/Q-type calcium channel alpha1 subunit gene (CACNA1A) on chromosome 19p13. There has also been a report of a CACNA1A mutation being associated with MA in a patient from a family with predominant FHM. We have previously demonstrated suggestive linkage of typical migraine in a large Australian family to the FHM region on chromosome 19p13. These findings suggest that CACNA1A may also be implicated in the etiology of typical migraine in this pedigree. To investigate this possibility, we sequenced two patients carrying the critical susceptibility haplotype surrounding CACNA1A. No disease-causing mutations or polymorphisms were revealed in any of the 47 exons screened. To determine whether the CACNA1A gene was implicated in typical migraine susceptibility in the general Caucasian population, we also analyzed 82 independent pedigrees and a large case-control group. We did not detect any linkage or association in these groups and conclude that if CACNA1A plays a role in typical migraine, it does not confer a major effect on the disease.
Abstract:
Balancing the competing interests of autonomy and protection of individuals is an escalating challenge confronting an ageing Australian society. Legal and medical professionals are increasingly being asked to determine whether individuals are legally competent/capable to make their own testamentary and substitute decision-making (that is, financial and/or personal/health care) decisions. No consistent and transparent competency/capacity assessment paradigm currently exists in Australia. Consequently, assessments are currently being undertaken on an ad hoc basis, which is concerning as Australia’s population ages and issues of competency/capacity increase. The absence of nationally accepted competency/capacity assessment guidelines and supporting principles results in legal and medical professionals involved with competency/capacity assessment implementing individual processes tailored to their own abilities. Legal and medical approaches differ both between and within the professions. The terminology used also varies. The legal practitioner is concerned with whether the individual has the legal ability to make the decision. A medical practitioner assesses fluctuations in physical and mental abilities. The problem is that the terms competency and capacity are used interchangeably, resulting in confusion about what is actually being assessed. The terminological and methodological differences subsequently create miscommunication and misunderstanding between the professions. Consequently, it is not necessarily a simple solution for a legal professional to seek the opinion of a medical practitioner when assessing testamentary and/or substitute decision-making competency/capacity. This research investigates the effects of the current inadequate testamentary and substitute decision-making assessment paradigm and whether there is a more satisfactory approach. This exploration is undertaken within a framework of therapeutic jurisprudence, which promotes principles fundamentally important in this context. Empirical research has been undertaken, first, to explore the effects of the current process with practising legal and medical professionals and, second, to determine whether miscommunication and misunderstanding actually exist between the professions such that they give rise to a tense relationship which is not conducive to satisfactory competency/capacity assessments. The necessity of reviewing the adequacy of the existing competency/capacity assessment methodology in the testamentary and substitute decision-making domain will be demonstrated, and recommendations for the development of a suitable process will be made.
Abstract:
The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients in Australian and New Zealand hospitals consume ≤50% of the offered food. The ANCDS found a significant association between poor food intake and increased in-hospital mortality after controlling for confounders (nutritional status, age, disease type and severity). Evidence for the effectiveness of medical nutrition therapy (MNT) in hospital patients eating poorly is lacking. An exploratory study was conducted in respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, 24-hour food intake (0%, 25%, 50%, 75%, 100% of offered meals) was evaluated for patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT, with food intake re-evaluated on day 7. 184 patients were observed over four weeks. Sixty-two patients (34%) consumed ≤50% of the offered meals. Simple interventions (feeding/menu assistance, diet texture modifications) improved intake to ≥75% in 30 patients, who did not require further MNT. Of the 32 patients referred for MNT, baseline and day-7 data were available for 20 patients (68±17 years, 65% females, BMI 22±5 kg/m², median energy and protein intake 2250 kJ and 25 g, respectively). On day 7, 17 participants (85%) demonstrated significantly higher consumption (4300 kJ, 53 g; p<0.01). Three participants demonstrated no improvement due to ongoing nutrition-impact symptoms. “Percentage food intake” was a quick tool to identify patients in whom simple interventions could enhance intake. MNT was associated with improved dietary intake in hospital patients. Further research is needed to establish a causal relationship.
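As a rough illustration of the screening rule described in the abstract above (per-meal intake recorded at 0–100% of offered food, with patients at ≤50% and nutrition-impact symptoms referred for MNT), the following Python sketch uses hypothetical patient records and invented field names; it is not code or data from the study.

```python
# Illustrative sketch only: hypothetical records and field names.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Patient:
    patient_id: str
    meal_intake_pct: list             # one entry (0, 25, 50, 75 or 100) per offered meal/snack
    nutrition_impact_symptoms: bool   # e.g. nausea, poor appetite, dysphagia

    @property
    def mean_intake_pct(self) -> float:
        return mean(self.meal_intake_pct)

def needs_mnt_referral(patient: Patient, threshold: float = 50.0) -> bool:
    """Refer when average intake is <=50% of offered food and nutrition-impact symptoms are present."""
    return patient.mean_intake_pct <= threshold and patient.nutrition_impact_symptoms

patients = [
    Patient("A", [25, 50, 25], True),    # eats poorly with symptoms -> refer
    Patient("B", [75, 100, 75], False),  # adequate intake -> monitor only
]
for p in patients:
    print(p.patient_id, "refer for MNT" if needs_mnt_referral(p) else "monitor")
```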
Abstract:
Migraine is a common complex disorder, currently classified into two main subtypes, migraine with aura (MA) and migraine without aura (MO). The strong preponderance of females over males suggests an X-linked genetic component. Recent studies have identified an X chromosomal susceptibility region (Xq24-q28) in two typical migraine pedigrees. This region harbours a potential candidate gene for the disorder, the serotonin receptor 2C (5-HT2C) gene. This study involved a linkage and association approach to investigate two single nucleotide variants in the 5-HT2C gene. In addition, the exonic coding regions of the 5-HT2C gene were sequenced for mutations in X-linked migraine pedigrees. This study did not detect any linkage or association, and no disease-causing mutations were identified. Hence, the results of this study do not support a significant role for the 5-HT2C gene in migraine predisposition. © 2003 Wiley-Liss, Inc.
Abstract:
Interest in chromosome 18 in essential hypertension comes from comparative mapping of rat blood pressure quantitative trait loci (QTL), familial orthostatic hypotensive syndrome studies, and essential hypertension pedigree linkage analyses indicating that a locus or loci on human chromosome 18 may play a role in hypertension development. To further investigate the involvement of chromosome 18 in human essential hypertension, the present study used a linkage scan approach to genotype twelve microsatellite markers spanning human chromosome 18 in 177 Australian Caucasian hypertensive (HT) sibling pairs. Linkage analysis showed significant excess allele sharing at the D18S61 marker when analyzed with SPLINK (P=0.00012), ANALYZE (Sibpair) (P=0.0081), and MAPMAKER SIBS (P=0.0001). Similarly, the D18S59 marker also showed evidence for excess allele sharing when analyzed with SPLINK (P=0.016), ANALYZE (Sibpair) (P=0.0095), and MAPMAKER SIBS (P=0.014). The adenylate cyclase activating polypeptide 1 gene (ADCYAP1) is involved in vasodilation and has been co-localized to the D18S59 marker. Testing a microsatellite marker in the 3′ untranslated region of ADCYAP1 in age- and gender-matched HT and normotensive (NT) individuals showed a possible association with hypertension (P=0.038; Monte Carlo P=0.02), but not with obesity. The present study supports a role for chromosome 18 in essential hypertension and indicates that the genomic region near the ADCYAP1 gene, or perhaps the gene itself, may be implicated. Further investigation is required to conclusively determine the extent to which ADCYAP1 polymorphisms are involved in essential hypertension. © 2003 Wiley-Liss, Inc.
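To illustrate the kind of case-control marker test reported above (an asymptotic P value alongside a Monte Carlo P value), the sketch below runs a chi-square test on an allele-count table and a label-permutation estimate of the P value. The allele counts are invented placeholders; this is not the software or data used in the study.

```python
# Illustrative sketch: case-control association test for a multi-allelic
# microsatellite marker, with an asymptotic chi-square p-value and a
# Monte Carlo (permutation) p-value. Allele counts below are invented.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)

# Rows: hypertensive (HT) and normotensive (NT) groups; columns: marker alleles.
counts = np.array([
    [40, 55, 30, 12],   # HT allele counts (placeholder)
    [60, 42, 28, 10],   # NT allele counts (placeholder)
])

chi2_obs, p_asymptotic, _, _ = chi2_contingency(counts)

# Monte Carlo p-value: permute group labels over the pooled alleles and
# recompute the chi-square statistic each time.
alleles = np.repeat(np.arange(counts.shape[1]), counts.sum(axis=0))
n_ht = counts[0].sum()
n_perm = 10_000
exceed = 0
for _ in range(n_perm):
    shuffled = rng.permutation(alleles)
    table = np.array([
        np.bincount(shuffled[:n_ht], minlength=counts.shape[1]),
        np.bincount(shuffled[n_ht:], minlength=counts.shape[1]),
    ])
    chi2_perm, _, _, _ = chi2_contingency(table)
    exceed += chi2_perm >= chi2_obs

p_monte_carlo = (exceed + 1) / (n_perm + 1)
print(f"asymptotic P = {p_asymptotic:.4f}, Monte Carlo P = {p_monte_carlo:.4f}")
```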
Abstract:
Background: Single nucleotide polymorphisms (SNPs) rs429358 (ε4) and rs7412 (ε2), both invoking changes in the amino-acid sequence of the apolipoprotein E (APOE) gene, have previously been tested for association with multiple sclerosis (MS) risk. However, none of these studies was sufficiently powered to detect modest effect sizes at acceptable type-I error rates. As both SNPs are only imperfectly captured on commonly used microarray genotyping platforms, their evaluation in the context of genome-wide association studies has been hindered until recently.
Methods: We genotyped 12 740 subjects hitherto not studied for their APOE status, imputed raw genotype data from 8739 subjects from five independent genome-wide association study datasets using the most recent high-resolution reference panels, and extracted genotype data for 8265 subjects from previous candidate gene assessments.
Results: Despite sufficient power to detect associations at genome-wide significance thresholds across a range of ORs, our analyses did not support a role of rs429358 or rs7412 in MS susceptibility. This included meta-analyses of the combined data across 13 913 MS cases and 15 831 controls (OR=0.95, p=0.259, and OR=1.07, p=0.0569, for rs429358 and rs7412, respectively).
Conclusion: Given the large sample size of our analyses, it is unlikely that the two APOE missense SNPs studied here exert any relevant effects on MS susceptibility.
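As an illustration of the meta-analysis step mentioned in the Results above, the sketch below combines per-study odds ratios by inverse-variance weighting on the log-OR scale (a standard fixed-effect approach). The study-level ORs and standard errors are made-up placeholders, not estimates from the MS datasets.

```python
# Illustrative sketch: fixed-effect (inverse-variance) meta-analysis of
# per-study odds ratios for a single SNP. Values below are placeholders.
import numpy as np
from scipy.stats import norm

# Hypothetical per-study odds ratios and standard errors of log(OR).
study_or = np.array([0.93, 0.98, 0.91, 1.02, 0.96])
se_log_or = np.array([0.06, 0.08, 0.10, 0.07, 0.09])

log_or = np.log(study_or)
weights = 1.0 / se_log_or**2            # inverse-variance weights

pooled_log_or = np.sum(weights * log_or) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))

z = pooled_log_or / pooled_se
p_value = 2 * norm.sf(abs(z))           # two-sided p-value

ci_low, ci_high = np.exp(pooled_log_or + np.array([-1.96, 1.96]) * pooled_se)
print(f"pooled OR = {np.exp(pooled_log_or):.3f} "
      f"(95% CI {ci_low:.3f}-{ci_high:.3f}), p = {p_value:.3f}")
```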
Abstract:
Background and aims: The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients consume ≤50% of the offered food in Australian and New Zealand hospitals. After controlling for confounders (nutritional status, age, disease type and severity), the ANCDS also established an independent association between poor food intake and increased in-hospital mortality. This study aimed to evaluate whether medical nutrition therapy (MNT) could improve dietary intake in hospital patients eating poorly.
Methods: An exploratory pilot study was conducted in the respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, percentage food intake (0%, 25%, 50%, 75%, and 100%) was evaluated for each main meal and snack over a 24-hour period in patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT. Food intake was re-evaluated on the seventh day following recruitment (post-MNT).
Results: 184 patients were observed over four weeks; 32 patients were referred for MNT. Although baseline and post-MNT data for 20 participants (68±17 years, 65% females) indicated a significant increase in median energy and protein intake post-MNT (3600 kJ/day, 40 g/day) versus baseline (2250 kJ/day, 25 g/day) (p<0.05), the increased intake met only 50% of dietary requirements. Persistent nutrition-impact symptoms affected intake.
Conclusion: In this pilot study, whilst dietary intake improved, it remained inadequate to meet participants’ estimated requirements due to ongoing nutrition-impact symptoms. Appropriate medical management and early enteral feeding could be a possible solution for such patients.
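To illustrate the paired baseline versus post-MNT comparison described above, here is a small sketch using a Wilcoxon signed-rank test on paired energy intakes. The intake values are simulated, and the choice of test is an assumption for illustration rather than the study's stated method.

```python
# Illustrative sketch: paired comparison of baseline vs post-MNT energy
# intake with a Wilcoxon signed-rank test. Values are simulated placeholders.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(1)

# Hypothetical 24-hour energy intake (kJ/day) for 20 patients.
baseline_kj = rng.normal(2250, 600, size=20).clip(min=0)
post_mnt_kj = baseline_kj + rng.normal(1350, 700, size=20)

stat, p_value = wilcoxon(post_mnt_kj, baseline_kj)
print(f"median baseline = {np.median(baseline_kj):.0f} kJ/day, "
      f"median post-MNT = {np.median(post_mnt_kj):.0f} kJ/day, p = {p_value:.3f}")

# Intake can also be expressed against estimated requirements, e.g. flagging
# anyone below 75% of a (hypothetical) daily requirement.
requirement_kj = np.full(20, 8000.0)    # placeholder estimated requirements
percent_met = post_mnt_kj / requirement_kj * 100
print(f"{np.mean(percent_met < 75) * 100:.0f}% of patients below 75% of requirements")
```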
Abstract:
BACKGROUND: The prevalence of protein-energy malnutrition in older adults is reported to be as high as 60% and is associated with poor health outcomes. Inadequate feeding assistance and mealtime interruptions may contribute to malnutrition and poor nutritional intake during hospitalisation. Despite being widely implemented in practice in the United Kingdom and increasingly in Australia, there have been few studies examining the impact of strategies such as Protected Mealtimes and dedicated feeding assistant roles on the nutritional outcomes of elderly inpatients.
AIMS: The aim of this research was to implement and compare three system-level interventions designed to specifically address mealtime barriers and improve the energy intakes of medical inpatients aged ≥65 years. This research also aimed to evaluate the sustainability of any changes to mealtime routines six months post-intervention and to gain an understanding of staff perceptions of the post-intervention mealtime experience.
METHODS: Three mealtime assistance interventions were implemented in three medical wards at Royal Brisbane and Women's Hospital:
AIN-only: additional assistant-in-nursing (AIN) with a dedicated nutrition role;
PM-only: multidisciplinary approach to meals, including Protected Mealtimes;
PM+AIN: combined intervention, AIN plus the multidisciplinary approach to meals.
An action research approach was used to carefully design and implement the three interventions in partnership with ward staff and managers. Significant time was spent in consultation with staff throughout the implementation period to facilitate ownership of the interventions and increase the likelihood of successful implementation. A pre-post design was used to compare the implementation and nutritional outcomes of each intervention with a pre-intervention group. Using the same wards, eligible participants (medical inpatients aged ≥65 years) were recruited to the pre-intervention group between November 2007 and March 2008 and to the intervention groups between January and June 2009. The primary nutritional outcome was daily energy and protein intake, which was determined by visually estimating plate waste at each meal and mid-meal on Day 4 of admission. Energy and protein intakes were compared between the pre- and post-intervention groups. Data were collected on a range of covariates (demographics, nutritional status and known risk factors for poor food intake), which allowed multivariate analysis of the impact of the interventions on nutritional intake. The provision of mealtime assistance to participants and the activities of ward staff (including mealtime interruptions) were observed in the pre-intervention and intervention groups, with staff observations repeated six months post-intervention. Focus groups were conducted with nursing and allied health staff in June 2009 to explore their attitudes and behaviours in response to the three mealtime interventions. These focus group discussions were analysed using thematic analysis.
RESULTS: A total of 254 participants were recruited to the study (pre-intervention: n=115, AIN-only: n=58, PM-only: n=39, PM+AIN: n=42). Participants had a mean age of 80 years (SD 8); 40% (n=101) were malnourished on hospital admission, 50% (n=108) had anorexia and 38% (n=97) required some assistance at mealtimes. Occasions of mealtime assistance significantly increased in all interventions (p<0.01). However, no change was seen in mealtime interruptions. No significant difference was seen in mean total energy and protein intake between the pre-intervention and intervention groups. However, when total kilojoule intake was compared with estimated requirements at the individual level, participants in the intervention groups were more likely to achieve adequate energy intake (OR=3.4, p=0.01), with no difference noted between interventions (p=0.29). Despite small improvements in nutritional adequacy, the majority of participants in the intervention groups (76%, n=103) had energy intakes inadequate to meet their estimated energy requirements. Patients with cognitive impairment or feeding dependency appeared to gain substantial benefit from the mealtime assistance interventions. The increase in occasions of mealtime assistance by nursing staff during the intervention period was maintained six months post-intervention. Staff focus groups highlighted the importance of clearly designating and defining mealtime responsibilities in order to provide adequate mealtime care. While the purpose of the dedicated feeding assistant was to increase levels of mealtime assistance, staff indicated that responsibility for mealtime duties may have merely shifted from nursing staff to the assistant. Implementing the multidisciplinary interventions empowered nursing staff to "protect" the mealtime from external interruptions, but further work is required to empower nurses to prioritise mealtime activities within their own work schedules. Staff reported an increase in the profile of nutritional care on all wards, with additional non-nutritional benefits noted, including improved mobility and functional independence and better identification of swallowing difficulties.
IMPLICATIONS: This PhD research provides clinicians with practical strategies to immediately introduce change to deliver better mealtime care in the hospital setting and, as such, has initiated local and state-wide roll-out of mealtime assistance programs. Improved nutritional intake of elderly inpatients was observed; however, given the modest effect size and reducing lengths of hospital stay, better nutritional outcomes may be achieved by targeting the hospital-to-home transition period. Findings from this study suggest that mealtime assistance interventions for elderly inpatients with cognitive impairment and/or functional dependency show promise.
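The adjusted odds ratio reported above (OR=3.4 for achieving adequate energy intake) is the kind of estimate a logistic regression with covariates would produce. The sketch below shows that general approach on simulated data with hypothetical variable names; it is not the study's actual model or dataset.

```python
# Illustrative sketch: adjusted odds ratio for achieving adequate energy
# intake (intervention vs pre-intervention), controlling for covariates.
# The data are simulated and the variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 254

df = pd.DataFrame({
    "intervention": rng.integers(0, 2, n),   # 1 = any mealtime intervention group
    "age": rng.normal(80, 8, n),
    "malnourished": rng.integers(0, 2, n),   # 1 = malnourished on admission
})
# Simulated binary outcome: adequate energy intake (1 = met estimated requirements).
logit_p = -1.0 + 1.2 * df["intervention"] - 0.5 * df["malnourished"]
df["adequate_intake"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df[["intervention", "age", "malnourished"]])
model = sm.Logit(df["adequate_intake"], X).fit(disp=0)

print(np.exp(model.params))   # exponentiated coefficients = adjusted odds ratios
print(model.pvalues)
```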
Abstract:
Background: Despite the increasing recognition that medical training tends to coincide with markedly high levels of stress and distress, there is a dearth of validated measures capable of gauging the prevalence of depressive symptoms among medical residents in the Arab/Islamic part of the world.
Objective: The aim of the present study is two-fold: first, to examine the diagnostic validity of the Patient Health Questionnaire (PHQ-9) in an Omani medical resident population in order to establish a cut-off point; and second, to compare gender, age, and residency level between Omani medical residents who report current depressive symptomatology and those who report as non-depressed according to the PHQ-9 cut-off threshold.
Results: A total of 132 residents (42 males and 90 females) consented to participate in this study. A cut-off score of 12 on the PHQ-9 yielded a sensitivity of 80.6% and a specificity of 94.0%. The rate of depression, as elicited by the PHQ-9, was 11.4%. Gender, age, and residency level played no significant role in endorsing depression.
Conclusion: This study indicated that the PHQ-9 is a reliable measure in this cross-cultural population. More studies employing robust methodology are needed to confirm this finding.
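As a rough illustration of how a questionnaire cut-off's sensitivity and specificity (such as the 80.6% and 94.0% quoted above at a PHQ-9 score of 12) can be computed against a reference diagnosis, the sketch below uses simulated scores; it is not the study's data or its validation procedure.

```python
# Illustrative sketch: sensitivity and specificity of a questionnaire score
# at various cut-offs against a reference diagnosis. Data are simulated.
import numpy as np

rng = np.random.default_rng(3)
n = 132

reference_depressed = rng.binomial(1, 0.15, n).astype(bool)   # reference diagnosis
# Simulated PHQ-9 scores (0-27): higher on average for depressed residents.
scores = np.where(reference_depressed,
                  rng.normal(15, 4, n),
                  rng.normal(6, 4, n)).clip(0, 27).round()

def sens_spec(scores, truth, cutoff):
    positive = scores >= cutoff
    sensitivity = (positive & truth).sum() / truth.sum()
    specificity = (~positive & ~truth).sum() / (~truth).sum()
    return sensitivity, specificity

for cutoff in range(8, 16):
    se, sp = sens_spec(scores, reference_depressed, cutoff)
    print(f"cut-off {cutoff:2d}: sensitivity {se:.2f}, specificity {sp:.2f}")
```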
Abstract:
Fundamental misconceptions regarding some basic phylogenetic terminology are presented in this opinion piece. An attempt is made to point out why these misconceptions exist and what may be causing the misapplication of terminology. Clarification is provided via basic definitions and simple explanations. Differences between the scientific fields of genetics and population genetics are discussed. The appropriate use of terminology is advocated, and alternative terms are proposed to eliminate one potential source of confusion. It is suggested that we use 'sequence data' instead of 'molecular data' and 'non-sequence data' instead of 'morphological data' in the fields of phylogenetics and systematics.
Abstract:
This is the first report of an antibody-fusion protein expressed in transgenic plants for direct use in a medical diagnostic assay. Through the use of gene constructs with appropriate promoters, high-level expression of an anti-glycophorin single-chain antibody fused to an epitope of HIV was obtained in the leaves and stems of tobacco, the tubers of potato and the seed of barley. This fusion protein replaces the SimpliRED™ diagnostic reagent used for detecting the presence of HIV-1 antibodies in human blood. That reagent is expensive and laborious to produce by conventional means, since chemical modifications to a monoclonal antibody are required. The plant-produced fusion protein was fully functional (by ELISA) in crude extracts and, for tobacco at least, could be used without further purification in the HIV agglutination assay. All three crop species produced sufficient reagent levels to be superior bioreactors to bacteria or mice; however, barley grain was the most attractive bioreactor as it expressed the highest level (150 μg of reagent g⁻¹), is inexpensive to produce and harvest, poses a minuscule gene-flow problem in the field, and the activity of the reagent is largely undiminished in stored grain. This work suggests that barley seed will be an ideal factory for the production of antibodies, diagnostic immunoreagents, vaccines and other pharmaceutical proteins.