Abstract:
Climate change contributes directly or indirectly to changes in species distributions, and there is very high confidence that recent climate warming is already affecting ecosystems. The Arctic has experienced the greatest regional warming in recent decades, and the trend is continuing. However, studies on northern ecosystems are scarce compared with more southerly regions, and a better understanding of past and present environmental change is needed to forecast the future. Multivariate methods were used to explore the distributional patterns of chironomids in 50 shallow (≤ 10 m) lakes in relation to 24 environmental variables in northern Fennoscandia, in the ecotonal area extending from boreal forest in the south to the orohemiarctic zone in the north. The highest taxon richness was noted at middle elevations, around 400 m a.s.l., and significantly lower values were observed in cold lakes situated in the tundra zone. Lake water alkalinity had the strongest positive correlation with taxon richness. Many taxa showed a preference for lakes in either tundra or forested areas. The variation in the chironomid abundance data was best correlated with sediment organic content (LOI), lake water total organic carbon content, pH, and air temperature, with LOI being the strongest variable. Three major lake groups were separated on the basis of their chironomid assemblages: (i) small, shallow, organic-rich lakes; (ii) large, base-rich lakes; and (iii) cold, clear, oligotrophic tundra lakes. The environmental variables that best discriminated the lake groups were LOI, taxon richness, and Mg. When repeated, this kind of approach could be useful and efficient in monitoring the effects of global change on species ranges. Many fast-spreading insects, including chironomids, show a remarkable ability to track environmental changes, and on this basis past environmental conditions have been reconstructed from their chitinous remains in lake sediment profiles.
To study the Holocene environmental history of subarctic aquatic systems and quantitatively reconstruct past temperatures at or near the treeline, long sediment cores covering the last 10,000 years (the Holocene) were collected from three lakes. Temperatures lower than expected from the presence of pine in the catchment during the mid-Holocene were reconstructed from a lake with large water volume and depth; this lake provided a thermal refuge for profundal, cold-adapted taxa during the warm period. In a shallow lake, the decrease in reconstructed temperatures during the late Holocene may reflect an indirect response of the midges to climate change through, e.g., pH change. The results from the three lakes indicated that the response of chironomids to climate has been more or less indirect. However, concurrent shifts in chironomid assemblages and vegetation in two lakes during the Holocene indicated that the midges and the terrestrial vegetation had responded to the same ultimate cause, most likely Holocene climate change. This was also supported by the similarity in long-term trends in faunal succession of the chironomid assemblages in several lakes in the area. In northern Finnish Lapland, the distribution of chironomids was significantly correlated with physical and limnological factors that are most likely to change as a result of future climate change. The indirect and individualistic response of aquatic systems to past climate change, as reconstructed from chironomid assemblages, suggests that in the future the lake ecosystems of the north will not respond to global climate change in one predictable way; rather, lakes may respond in various ways that depend on the initial characteristics of the catchment area and the lake.
Abstract:
Undergraduate Medical Imaging (MI) students at QUT attend their first clinical placement towards the end of semester two. Students undertake two (pre)clinical skills development units, one theoretical and one practical, and gain good contextual and theoretical knowledge via a blended learning model employing multiple learning methods: theory lectures, practical sessions, tutorial sessions in both simulated and virtual environments, and pre-clinical scenario-based tutorial sessions. The aim of this project is to evaluate the use of blended learning in the context of first-year Medical Imaging Radiographic Technique and its effectiveness in preparing students for their first clinical experience. It is hoped that the multiple teaching methods employed within the pre-clinical training unit at QUT build students' clinical skills prior to the real situation. A quantitative approach will be taken, evaluating via pre- and post-clinical placement surveys; these data will be correlated with data gathered in the previous year on the effectiveness of this training approach prior to clinical placement. In 2014, 59 students surveyed prior to their clinical placement demonstrated the positive benefits of using a variety of learning tools to enhance their learning. 98.31% (n=58) of students agreed or strongly agreed that the theory lectures were a useful tool to enhance their learning. This was followed closely by 97% (n=57) of students recognising the value of performing role-play simulation prior to clinical placement. Tutorial engagement was considered useful by 93.22% (n=55), whilst 88.14% (n=52) found the x-raying of phantoms in the simulated radiographic laboratory beneficial. Self-directed learning yielded 86.44% (n=51), and the virtual reality simulation software was valuable for 72.41% (n=42) of students.
Each of the 4 students who disagreed or strongly disagreed with the usefulness of any one tool strongly agreed with the usefulness of at least one other learning tool. The impact of the blended learning model in meeting diverse student needs continues to be positive, with students engaging in most offerings. Students largely prefer pre-clinical scenario-based practical and tutorial sessions where 'real-world' situations are discussed.
Abstract:
Malaria causes a worldwide annual mortality of about a million people. Rapidly evolving drug-resistant strains of the parasite have created a pressing need for the identification of new drug targets and vaccine candidates. By developing fractionation protocols to enrich parasites from low-parasitemia patient samples, we have carried out the first proteomics analysis of clinical isolates of early stages of Plasmodium falciparum (Pf) and P. vivax. Patient-derived malarial parasites were directly processed and analyzed using a shotgun proteomics approach with high-sensitivity MS for protein identification. Our study revealed about 100 parasite-coded gene products, including many known drug targets such as Pf hypoxanthine-guanine phosphoribosyltransferase, Pf L-lactate dehydrogenase, and plasmepsins. In addition, our study reports the expression of several parasite proteins in clinical ring stages that have never been reported in the ring stages of the laboratory-cultivated parasite strain. This proof-of-principle study represents a noteworthy step forward in our understanding of the pathways elaborated by the parasite within the malaria patient and will pave the way towards identification of new drug and vaccine targets that can aid malaria therapy.
Abstract:
Glaucoma, an optic neuropathy with excavation of the optic nerve head and a corresponding visual field defect, is one of the leading causes of blindness worldwide. However, visual disability can often be avoided or delayed if the disease is diagnosed at an early stage; recognising the risk factors for the development and progression of glaucoma may therefore prevent further damage. The purpose of the present study was to evaluate factors associated with visual disability caused by glaucoma and the genetic features of two risk factors, exfoliation syndrome (ES) and a positive family history of glaucoma. The study material consisted of three groups: 1) deceased glaucoma patients from the Ekenäs practice, 2) glaucoma families from the Ekenäs region, and 3) population-based families with and without exfoliation syndrome from Kökar Island. For the retrospective study, 106 patients with open-angle glaucoma (OAG) were identified. At the last visit, 17 patients were visually impaired; blindness caused by glaucoma was found in one or both eyes in 16 patients and in both eyes in six patients. The cumulative incidence of glaucoma-caused blindness for one eye was 6% at 5 years, 9% at 10 years, and 15% at 15 years from initiation of treatment. The factors associated with blindness caused by glaucoma were an advanced stage of glaucoma at diagnosis, fluctuation in intraocular pressure during treatment, the presence of exfoliation syndrome, and poor patient compliance. A cross-sectional population-based study was performed in 1960-1962 on Kökar Island, and the same population was followed until 2002. In total, 965 subjects (530 over 50 years of age) were examined at least once. The prevalence of exfoliation syndrome was 18% among subjects older than 50 years, and 75 of the 78 ES-positive subjects belonged to the same extended pedigree.
According to the segregation and family analysis, exfoliation syndrome appeared to be inherited as an autosomal dominant trait with reduced penetrance. The penetrance was more reduced in males, but the risk for glaucoma was higher in males than in females. To find the gene or genes associated with exfoliation syndrome, a genome-wide scan was performed for 64 members (28 ES-affected and 36 controls) of the Kökar pedigree. A promising result was found: the highest two-point LOD score was 3.45 (θ = 0.04) on chromosome 18q12.1-21.33. The presence of mutations in the glaucoma genes TIGR/MYOC (myocilin) and OPTN (optineurin) was analysed in eight glaucoma families from the Ekenäs region. An inheritance pattern resembling an autosomal dominant mode was detected in all of these families. Primary open-angle glaucoma or exfoliation glaucoma was found in 35% of the 136 family members, and a further 28% were suspected of having glaucoma. No mutations were detected in these families.
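The two-point LOD score mentioned above is the log10 likelihood ratio of linkage at a given recombination fraction θ versus free recombination (θ = 0.5). A minimal sketch for the simplest fully informative, phase-known case can illustrate the idea; the meiosis counts below are hypothetical, and real pedigree scans such as the one described use dedicated linkage-analysis software over the full pedigree likelihood.

```python
import math

def two_point_lod(theta, recombinants, nonrecombinants):
    """Two-point LOD score for a fully informative, phase-known cross:
    log10 of L(theta) / L(theta = 0.5). Simplified illustration only."""
    n = recombinants + nonrecombinants
    l_theta = (theta ** recombinants) * ((1 - theta) ** nonrecombinants)
    l_null = 0.5 ** n  # likelihood under free recombination
    return math.log10(l_theta / l_null)

# Hypothetical example: 1 recombinant among 20 informative meioses
lod = two_point_lod(0.05, 1, 19)
print(round(lod, 2))  # → 4.3
```

A LOD score above 3 (odds of 1000:1 in favour of linkage) is the conventional threshold for declaring significant linkage, which is why the reported 3.45 is described as promising.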
Abstract:
With transplant rejection rendered a minor concern and survival rates after liver transplantation (LT) steadily improving, long-term complications are attracting more attention. Current immunosuppressive therapies, together with other factors, are accompanied by considerable long-term toxicity, which clinically manifests as renal dysfunction, high risk for cardiovascular disease, and cancer. This thesis investigates the incidence, causes, and risk factors for such renal dysfunction, cardiovascular risk, and cancer after LT. Long-term effects of LT are further addressed by surveying the quality of life and employment status of LT recipients. The consecutive patients included had undergone LT at Helsinki University Hospital from 1982 onwards. Data regarding renal function – creatinine and estimated glomerular filtration rate (GFR) – were recorded before and repeatedly after LT in 396 patients. The presence of hypertension, dyslipidemia, diabetes, impaired fasting glucose, and overweight/obesity before and 5 years after LT was determined among 77 patients transplanted for acute liver failure. The entire cohort of LT patients (540 patients), including both children and adults, was linked with the Finnish Cancer Registry, and numbers of cancers observed were compared to site-specific expected numbers based on national cancer incidence rates stratified by age, gender, and calendar time. Health-related quality of life (HRQoL), measured by the 15D instrument, and employment status were surveyed among all adult patients alive in 2007 (401 patients). The response rate was 89%. Posttransplant cardiovascular risk factor prevalence and HRQoL were compared with that in the age- and gender-matched Finnish general population. The cumulative risk for chronic kidney disease increased from 10% at 5 years to 16% at 10 years following LT. GFR up to 10 years after LT could be predicted by the GFR at 1 year. 
In patients transplanted for chronic liver disease, a moderate correlation of pretransplant GFR with later GFR was also evident, whereas in acute liver failure patients even severe pretransplant renal dysfunction often recovered after LT. By 5 years after LT, 71% of acute liver failure patients were receiving antihypertensive medications, 61% exhibited dyslipidemia, 10% were diabetic, 32% were overweight, and 13% were obese. Compared with the general population, only hypertension displayed a significantly elevated prevalence among patients (2.7-fold), whereas patients exhibited 30% less dyslipidemia and 71% less impaired fasting glucose. The cumulative incidence of cancer was 5% at 5 years and 13% at 10 years. Compared with the general population, patients were subject to a 2.6-fold cancer risk, with non-melanoma skin cancer (standardized incidence ratio, SIR, 38.5) and non-Hodgkin lymphoma (SIR 13.9) the predominant malignancies. Non-Hodgkin lymphoma was associated with male gender, young age, and the immediate posttransplant period, whereas old age and antibody induction therapy raised skin-cancer risk. HRQoL deviated clinically unimportantly from values in the general population, but significant deficits among patients were evident in some physical domains; HRQoL did not seem to decrease with longer follow-up. Although 87% of patients reported improved working capacity, data on return to working life showed marked age-dependency: among patients aged under 40 at LT, 70-80% returned to work; among those aged 40 to 50, 55%; and among those over 50, 15-28%. The most common cause of unemployment was early retirement before LT, and employed patients exhibited better HRQoL than those unemployed. In conclusion, although renal impairment, hypertension, and cancer are common after LT and increase with time, patients' quality of life remains comparable with that of the general population.
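A standardized incidence ratio, as quoted above, is the ratio of observed to expected cancer counts, where the expected count is built by summing, over strata of age, gender, and calendar time, the cohort's person-years at risk multiplied by the corresponding national incidence rate. A hedged sketch with invented figures (not the study's data):

```python
def standardized_incidence_ratio(observed, strata):
    """SIR = observed cases / expected cases, where expected is the sum
    over strata (e.g. age x gender x calendar period) of
    person-years * population incidence rate. Illustrative only."""
    expected = sum(person_years * rate for person_years, rate in strata)
    return observed / expected

# Hypothetical strata: (person-years, incidence rate per person-year)
strata = [(1200, 0.0005), (800, 0.001), (500, 0.002)]  # expected = 2.4
sir = standardized_incidence_ratio(6, strata)
print(round(sir, 2))  # → 2.5
```

An SIR of 1 means the cohort's incidence matches the general population's; the study's 2.6-fold overall cancer risk corresponds to an SIR of 2.6 computed this way.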
Abstract:
Background: Diabetic foot complications are the leading cause of lower extremity amputation and diabetes-related hospitalisation in Australia. Studies demonstrate significant reductions in amputations and hospitalisation when health professionals implement best-practice management. Whilst other nations have surveyed health professionals on specific diabetic foot management, to the best of the authors' knowledge this has not occurred in Australia. The primary aim of this study was to examine Australian podiatrists' diabetic foot management against the best-practice recommendations of the Australian National Health and Medical Research Council. Methods: A 36-item Australian Diabetic Foot Management survey, employing seven-point Likert scales (1 = Never; 7 = Always) to measure multiple aspects of best-practice diabetic foot management, was developed. The survey was briefly tested for face and content validity and electronically distributed to Australian podiatrists via professional associations. Demographics including sex, years treating patients with diabetes, employment sector, and patient numbers were also collected. Chi-squared and Mann-Whitney U tests were used to test differences between sub-groups. Results: Three hundred and eleven podiatrists responded; 222 (71%) were female, 158 (51%) were from the public sector, and the median experience was 11-15 years. Participants reported treating a median of 21-30 diabetes patients each week, including 1-5 with foot ulcers. Overall, participants registered median scores of at least "very often" (>6) in their use of most items covering best-practice diabetic foot management. Notable exceptions were: "never" (1 (1-3)) using total contact casting, "sometimes" (4 (2-5)) performing an ankle brachial index, "sometimes" (4 (1-6)) using the University of Texas Wound Classification System, and "sometimes" (4 (3-6)) referring to specialist multi-disciplinary foot teams.
Public sector podiatrists reported higher use of, or access to, all of those items than private sector podiatrists (p < 0.01). Conclusions: This study provides the first baseline information on Australian podiatrists' adherence to best-practice diabetic foot guidelines. Podiatrists appear to manage large caseloads of people with diabetes and are generally implementing best-practice guideline recommendations, with some notable exceptions. Further studies are required to identify barriers to implementing these recommendations to ensure all Australians with diabetes have access to best-practice care to prevent amputations.
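The Mann-Whitney U test used for the sector comparisons ranks the pooled ordinal Likert scores and sums the ranks within one group; no normality assumption is needed. A self-contained sketch of the U statistic with midranks for ties (the scores below are hypothetical, and a complete test would additionally derive a p-value from the distribution of U):

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for group x versus group y, computed
    from rank sums over the pooled sample, using midranks for ties."""
    combined = sorted((value, idx) for idx, value in enumerate(x + y))
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        # Find the run of tied values starting at position i
        j = i
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1
        midrank = (i + j) / 2 + 1  # average 1-based rank of the tied run
        for k in range(i, j + 1):
            ranks[combined[k][1]] = midrank
        i = j + 1
    rank_sum_x = sum(ranks[:len(x)])
    return rank_sum_x - len(x) * (len(x) + 1) / 2

# Hypothetical Likert scores for one survey item
public = [6, 7, 6, 5, 7]
private = [4, 5, 3, 4]
u = mann_whitney_u(public, private)  # near the maximum of 5 * 4 = 20
```

A U close to its maximum (len(x) * len(y)) indicates that nearly every score in one group exceeds every score in the other, matching the reported pattern of higher public-sector scores.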
Abstract:
This study is one part of a collaborative depression research project, the Vantaa Depression Study (VDS), involving the Department of Mental Health and Alcohol Research of the National Public Health Institute, Helsinki, and the Department of Psychiatry of the Peijas Medical Care District (PMCD), Vantaa, Finland. The VDS comprises two parts: a record-based study of 803 patients and a prospective, naturalistic cohort study of 269 patients. Both include secondary-level care psychiatric out- and inpatients with a new episode of major depressive disorder (MDD). Data for the record-based part of the study came from a computerised patient database incorporating all outpatient visits as well as treatment periods in the inpatient unit. We included all patients aged 20 to 59 years who had been assigned a clinical diagnosis of depressive episode or recurrent depressive disorder according to International Classification of Diseases, 10th edition (ICD-10) criteria and who had at least one outpatient visit or inpatient day in the PMCD during the study period January 1, 1996, to December 31, 1996. All those with an earlier diagnosis of schizophrenia, other non-affective psychosis, or bipolar disorder were excluded, as were patients treated in the somatic departments of Peijas Hospital and those who had consulted but not received treatment from the psychiatric consultation services. The study sample comprised 290 male and 513 female patients. All psychiatric records were reviewed, and a structured 57-item form was completed for each patient. The treatment provided was reviewed up to the end of the depressive episode or to the end of 1997. Most (84%) of the patients received antidepressants, including a minority (11%) treated with clearly subtherapeutic doses.
During the treatment period, the depressed patients investigated averaged only a few visits to psychiatrists (median two visits) but more to other health professionals (median seven). One-fifth of both genders were inpatients, with a mean of nearly two inpatient treatment periods during the overall treatment period investigated; the median length of a hospital stay was 2 weeks. Use of antidepressants was quite conservative: the first antidepressant had been switched to another compound in only about one-fifth (22%) of patients, and only two patients had received up to five antidepressant trials. Only 7% of those prescribed any antidepressant received two antidepressants simultaneously, and none of the patients was prescribed any other augmentation medication. Refusing antidepressant treatment was the most common explanation for receiving no antidepressants. During the treatment period, 19% of those not already receiving a disability pension were granted one due to psychiatric illness. These patients were nearly nine years older than those not pensioned; they were also more severely ill, made significantly more visits to professionals, and received significantly more concomitant medications (hypnotics, anxiolytics, and neuroleptics) than did those receiving no pension. In the prospective part of the VDS, 806 adult patients (aged 20-59 years) were screened in the PMCD for a possible new episode of DSM-IV MDD. Of these, 542 patients were interviewed face-to-face with the WHO Schedules for Clinical Assessment in Neuropsychiatry (SCAN), Version 2.0. Exclusion criteria were the same as in the record-based part of the VDS. Of these 542, 269 patients fulfilled the criteria for a DSM-IV major depressive episode. This study investigated factors associated with patients' functional disability, social adjustment, and work disability (being on sick-leave or being granted a disability pension).
At the beginning of treatment, the most important single factor associated with overall social and functional disability was severity of depression, but older age and personality disorders also contributed significantly. Total duration and severity of depression, phobic disorders, alcoholism, and personality disorders all independently contributed to poor social adjustment. Of those who were employed, almost half (43%) were on sick-leave. Besides severity and number of episodes of depression, female gender and age over 50 years strongly and independently predicted being on sick-leave. Factors influencing social and occupational disability and social adjustment among patients with MDD were studied prospectively during an 18-month follow-up period. Patients' functional disability and social adjustment improved during the follow-up concurrently with recovery from depression. The current level of functioning and social adjustment of a patient with depression was predicted by severity of depression, recurrence before baseline and during follow-up, lack of full remission, and time spent depressed. Comorbid psychiatric disorders, personality traits (neuroticism), and perceived social support also had a significant influence. During the 18-month follow-up, 13 (5%) of the 269 patients switched to bipolar disorder and 58 (20%) dropped out. Of the remaining 198 patients, 186 (94%) were not pensioned at baseline and were investigated; 21 of them were granted a disability pension during the follow-up. Those who received a pension were significantly older, less often had vocational education, and were more often on sick-leave than those not pensioned, but did not differ with regard to any other sociodemographic or clinical factors. Patients with MDD mostly received adequate antidepressant treatment, but problems existed in treatment intensity and monitoring.
It is challenging to find those at greatest risk for disability and to provide them with adequate and efficacious treatment. This also poses a great challenge to society as a whole to provide sufficient resources.
Abstract:
Measurement of fractional exhaled nitric oxide (FENO) has proven useful in the assessment of patients with respiratory symptoms, especially in predicting steroid response. The objective of these studies was to clarify issues relevant to the clinical use of FENO. The influence of allergic sensitization per se on FENO was studied in healthy asymptomatic subjects; the association between airway inflammation and bronchial hyperresponsiveness (BHR) was examined in steroid-naive subjects with symptoms suggesting asthma, together with possible differences in this association between atopic and nonatopic subjects; the influence of smoking on FENO was compared between atopic and nonatopic steroid-naive asthmatics and healthy subjects; and the short-term repeatability of FENO in COPD patients was examined to assess whether the degree of chronic obstruction influences repeatability. For these purposes, we studied a random sample of 248 citizens of Helsinki, 227 army conscripts with current symptoms suggesting asthma, 19 COPD patients, and 39 healthy subjects. FENO measurement, spirometry with a bronchodilatation test, a structured interview, skin prick tests, and histamine and exercise challenges were performed. Among healthy subjects with no signs of airway disease, median FENO was similar in skin prick test-positive and -negative subjects, and the upper normal limit of FENO was 30 ppb. In atopic and nonatopic subjects with symptoms suggesting asthma, FENO was associated with the severity of exercise- or histamine-induced BHR only in atopic patients. FENO in smokers with steroid-naive asthma was significantly higher than in healthy smokers and nonsmokers. Among atopic asthmatics, FENO was significantly lower in smokers than in nonsmokers, whereas no difference appeared among nonatopic asthmatics. The 24-h repeatability of FENO was as good in COPD patients as in healthy subjects.
These findings indicate that allergic sensitization per se does not influence FENO, supporting the view that elevated FENO indicates NO-producing airway inflammation and that the same reference range can be applied to both skin prick test-positive and -negative subjects. The significant correlation between FENO and the degree of BHR only in atopic steroid-naive subjects with current asthmatic symptoms supports the view that the pathogenesis of BHR in atopic asthma is strongly linked to NO-producing airway inflammation, whereas other mechanisms may dominate in the development of BHR in nonatopic asthma. The attenuation of FENO only in atopic, but not in nonatopic, smokers with steroid-naive asthma may result from differences between atopic and nonatopic asthma in the mechanisms of FENO formation, as well as in the sensitivity of these mechanisms to smoking. The results suggest, however, that in young adult smokers FENO measurement may prove useful in the assessment of airway inflammation. Finally, the short-term repeatability of FENO was equally good in COPD patients with moderate to very severe disease and in healthy subjects.
Abstract:
Background: The purpose of this study was threefold: first, to determine the relationship between serum vitamin profiles and ischemic stroke; second, to investigate the association of methylenetetrahydrofolate reductase (MTHFR), endothelial nitric oxide synthase (eNOS), angiotensin-converting enzyme (ACE), and apolipoprotein E (ApoE) gene polymorphisms with ischemic stroke and to correlate them with serum vitamin profiles among ischemic stroke patients; and third, to highlight the interaction of MTHFR and eNOS haplotypes with serum vitamin profiles and ischemic stroke risk. Methods: Polymorphisms of these genes were analyzed in age-, sex-, and ethnicity-matched cases and controls (n = 594); serum vitamin profiles were determined using immunoassays. Results: The MTHFR 677C>T, 1298A>C, eNOS intron 4a/b, and ApoE polymorphisms were significantly associated with increased risk of ischemic stroke. Elevated serum homocysteine and vitamin B12 levels were associated with the MTHFR 677C>T and eNOS intron 4a/b polymorphisms, and the ApoE and eNOS −786T>C polymorphisms were associated with increased serum vitamin B12 levels. However, none of the polymorphisms influenced serum folate levels except MTHFR 1298A>C. Different patterns of MTHFR and eNOS haplotypes tended to affect serum vitamin profiles to different degrees, contributing to either increased susceptibility to or protection against ischemic stroke. Overall, increased serum homocysteine and vitamin B12 levels were associated with a higher risk of ischemic stroke in the investigated population. Conclusions: The present study suggests that the genotypes and haplotypes of the MTHFR 677C>T and eNOS intron 4a/b polymorphisms, by modulating homocysteine and vitamin B12 levels, are potential markers in the pathophysiological processes of ischemic stroke.
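Case-control genotype associations of this kind are typically quantified with an odds ratio from a 2x2 table of carriers versus non-carriers among cases and controls. A minimal sketch with invented counts (not the study's data) illustrates the computation:

```python
def odds_ratio(case_exposed, case_unexposed, control_exposed, control_unexposed):
    """Odds ratio from a 2x2 case-control table: the cross-product
    ratio (a*d)/(b*c). An OR above 1 suggests the exposure (here, a
    genotype such as a hypothetical risk allele carrier status) is
    associated with increased disease risk. Illustrative counts only."""
    return (case_exposed * control_unexposed) / (case_unexposed * control_exposed)

# Hypothetical table: carriers vs non-carriers among stroke cases and controls
or_estimate = odds_ratio(120, 180, 80, 214)
print(round(or_estimate, 2))  # → 1.78
```

A confidence interval (e.g. via the log-odds standard error) would be needed alongside the point estimate to judge significance, as the study does for each polymorphism.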
Abstract:
Malnutrition and poor nutritional intake have been identified as key issues associated with poorer clinical outcomes in chronic obstructive pulmonary disease (COPD) patients. There is strong evidence showing nutritional support is effective in treating malnutrition in stable COPD, but there is only limited research regarding nutritional status in patients treated with noninvasive ventilation (NIV). The impact of NIV during acute exacerbations of respiratory disease on nutritional status requires further investigation.
Abstract:
Partitional clustering algorithms, which partition the dataset into a pre-defined number of clusters, can be broadly classified into two types: algorithms which explicitly take the number of clusters as input and algorithms that take the expected size of a cluster as input. In this paper, we propose a variant of the k-means algorithm and prove that it is more efficient than standard k-means algorithms. An important contribution of this paper is the establishment of a relation between the number of clusters and the size of the clusters in a dataset through the analysis of our algorithm. We also demonstrate that the integration of this algorithm as a pre-processing step in classification algorithms reduces their running-time complexity.
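The abstract does not detail the proposed variant, so as a baseline reference point only, standard Lloyd's k-means, which takes the number of clusters k explicitly as input (the first of the two input conventions mentioned), can be sketched as follows; the data points are invented for illustration:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's k-means on tuples of floats. Alternates between
    assigning each point to its nearest centroid and recomputing each
    centroid as the mean of its assigned points. Baseline sketch only;
    it does not reproduce the paper's proposed variant."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from the data
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])),
            )
            clusters[nearest].append(p)
        # Recompute centroids; keep the old one if a cluster goes empty
        centroids = [
            tuple(sum(coord) / len(cl) for coord in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Two well-separated groups; k = 2 recovers them
points = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.2, 4.9)]
cents, cls = kmeans(points, 2)
```

The size-as-input convention described in the abstract can be related to this one by observing that, for n points and a target cluster size s, the corresponding number of clusters is roughly n / s, which is the kind of relation the paper analyzes.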
Abstract:
Antibiotic resistance was evaluated in 40 Staphylococcus aureus clinical isolates obtained from 110 diabetic patients (36%). Of these, 32 (80%) of the isolates showed multidrug resistance to more than eight antibiotics, and 35% of the isolates were methicillin-resistant S. aureus (MRSA). All 40 S. aureus strains (100%) screened from diabetic clinical specimens were resistant to penicillin, 63% to ampicillin, 55% to streptomycin, 50% to tetracycline, and 50% to gentamicin, whereas lower resistance rates were observed for ciprofloxacin (20%) and rifampicin (8%). In contrast, all (100%) S. aureus strains were susceptible to teicoplanin, followed by vancomycin (95%). Genotypic examination revealed that 80% of the aminoglycoside-resistant S. aureus (ARSA) carried aminoglycoside-modifying enzyme (AME) genes, whereas the 20% of ARSA showing non-AME-mediated (adaptive) aminoglycoside resistance lacked these genes. In contrast, all MRSA isolates possessed the mecA and femA genetic determinants.