974 results for Clinical Protocols
Abstract:
BACKGROUND The emergence of high levels of resistance in Cryptolestes ferrugineus (Stephens) in recent years threatens the sustainability of phosphine, a key fumigant used worldwide to disinfest stored grain. We aimed to develop robust fumigation protocols that could be used in a range of practical situations to control this resistant pest. RESULTS Values of the lethal time to kill 99.9% (LT99.9, in days) of mixed-age populations, containing all life stages, of a susceptible and a strongly resistant C. ferrugineus population were established at three phosphine concentrations (1.0, 1.5 and 2.0 mg L−1) and three temperatures (25, 30 and 35 °C). Multiple linear regression analysis revealed that phosphine concentration and temperature both contributed significantly to the LT99.9 of a population (P < 0.003, R2 = 0.92), with concentration being the dominant variable, accounting for 75.9% of the variation. Across all concentrations, the LT99.9 of the strongly resistant C. ferrugineus population was longest at the lowest temperature and shortest at the highest temperature. For example, 1.0 mg L−1 of phosphine is required for 20, 15 and 15 days, 1.5 mg L−1 for 12, 11 and 9 days, and 2.0 mg L−1 for 10, 7 and 6 days at 25, 30 and 35 °C, respectively, to achieve 99.9% mortality of the strongly resistant population. We also observed that phosphine concentration is inversely proportional to the fumigation period required for population extinction of this pest. CONCLUSION The fumigation protocols developed in this study will be used in recommending changes to the currently registered rates of phosphine in Australia for the management of strongly resistant C. ferrugineus populations, and can be repeated in any country where this type of resistance appears.
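As a rough illustration of the regression reported above, the sketch below (Python, assuming NumPy is available) refits LT99.9 against concentration and temperature using only the nine values quoted for the strongly resistant population; the published analysis may have used a different model specification and additional data.

```python
import numpy as np

# LT99.9 values (days) for the strongly resistant C. ferrugineus population,
# taken from the abstract: each row pairs a (concentration, temperature) with its LT99.9.
conc = np.array([1.0, 1.0, 1.0, 1.5, 1.5, 1.5, 2.0, 2.0, 2.0])      # mg L-1
temp = np.array([25, 30, 35, 25, 30, 35, 25, 30, 35], dtype=float)  # deg C
lt999 = np.array([20, 15, 15, 12, 11, 9, 10, 7, 6], dtype=float)    # days

# Design matrix with an intercept: LT99.9 ~ b0 + b1*concentration + b2*temperature
X = np.column_stack([np.ones_like(conc), conc, temp])
coef, *_ = np.linalg.lstsq(X, lt999, rcond=None)

fitted = X @ coef
ss_res = np.sum((lt999 - fitted) ** 2)
ss_tot = np.sum((lt999 - lt999.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

print(f"intercept={coef[0]:.2f} days, concentration slope={coef[1]:.2f} days per mg L-1, "
      f"temperature slope={coef[2]:.2f} days per deg C, R^2={r_squared:.2f}")
```

Both fitted slopes come out negative, consistent with the abstract's observation that higher concentrations and higher temperatures shorten the exposure period required.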
Abstract:
Poultry are considered a major source of campylobacteriosis in humans. A total of 1866 Campylobacter spp. isolates collected through the poultry processing chain were typed using flaA-restriction fragment length polymorphism to measure the impact of processing on the genotypes present. Temporally related human clinical isolates (n = 497) were also typed. Isolates were obtained from whole-carcass rinses of chickens collected before scalding, after scalding, before immersion chilling, after immersion chilling and after packaging, as well as from individual caecal samples. A total of 32 genotypes comprising at least four isolates each were recognised. Simpson's Index of Diversity (D) was calculated for each sampling site within each flock, for each flock as a whole and for the clinical isolates. From caecal collection to the after-packaging samples, the D value did not change in two flocks, decreased in one flock and increased in the fourth flock. Dominant genotypes occurred in each flock but their constituent percentages changed through processing. There were 23 overlapping genotypes between clinical and chicken isolates. The diversity of Campylobacter is flock dependent and may alter through processing. This study confirms that poultry are a source of campylobacteriosis in the Australian population, although other sources may contribute.
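For readers unfamiliar with the diversity statistic used here, the sketch below (Python) computes Simpson's Index of Diversity in the form commonly used for typing data, D = 1 − Σ n_i(n_i − 1) / [N(N − 1)]; the genotype labels are hypothetical and serve only to illustrate the calculation.

```python
from collections import Counter

def simpsons_diversity(genotypes):
    """Simpson's Index of Diversity: D = 1 - sum(n_i*(n_i - 1)) / (N*(N - 1)),
    where n_i is the number of isolates with genotype i and N the total number of isolates."""
    counts = Counter(genotypes)
    n_total = sum(counts.values())
    if n_total < 2:
        raise ValueError("need at least two isolates to compute D")
    return 1 - sum(n * (n - 1) for n in counts.values()) / (n_total * (n_total - 1))

# Hypothetical flaA-RFLP genotype labels for isolates from one sampling site
site_isolates = ["RFLP-1", "RFLP-1", "RFLP-2", "RFLP-3", "RFLP-1", "RFLP-2"]
print(f"D = {simpsons_diversity(site_isolates):.3f}")
```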
Abstract:
Objective To develop the DCDDaily, an instrument for objective and standardized clinical assessment of capacity in activities of daily living (ADL) in children with developmental coordination disorder (DCD), and to investigate its usability, reliability, and validity. Subjects Five- to eight-year-old children with and without DCD. Main measures The DCDDaily was developed based on a thorough review of the literature and extensive expert involvement. To investigate the usability (assessment time and feasibility), reliability (internal consistency and repeatability), and validity (concurrent and discriminant validity) of the DCDDaily, children were assessed with the DCDDaily and the Movement Assessment Battery for Children-2 Test, and their parents filled in the Movement Assessment Battery for Children-2 Checklist and the Developmental Coordination Disorder Questionnaire. Results 459 children were assessed (DCD group, n = 55; normative reference group, n = 404). Assessment was possible within 30 minutes and in any clinical setting. For internal consistency, Cronbach's α was 0.83; intraclass correlations were 0.87 for test–retest reliability and 0.89 for inter-rater reliability. Concurrent correlations with the Movement Assessment Battery for Children-2 Test and the questionnaires were ρ = −0.494, 0.239, and −0.284 (p < 0.001). Discriminant validity measures showed significantly worse performance in the DCD group than in the control group (mean (SD) score 33 (5.6) versus 26 (4.3), p < 0.001). The area under the receiver operating characteristic curve was 0.872, and sensitivity and specificity were both 80%. Conclusions The DCDDaily is a valid and reliable instrument for the clinical assessment of capacity in ADL that is feasible for use in clinical practice.
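The abstract does not describe the DCDDaily item structure, so the sketch below (Python, with NumPy) only illustrates how the reported internal-consistency statistic, Cronbach's α, is computed from a child-by-item score matrix; the scores shown are hypothetical.

```python
import numpy as np

def cronbachs_alpha(item_scores):
    """Cronbach's alpha for an (n_children x n_items) score matrix:
    alpha = k/(k - 1) * (1 - sum of item variances / variance of total scores)."""
    scores = np.asarray(item_scores, dtype=float)
    n_items = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return n_items / (n_items - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical scores for five children (rows) on four items (columns)
example = [[1, 2, 2, 3],
           [2, 2, 3, 3],
           [1, 1, 2, 2],
           [3, 3, 3, 3],
           [2, 2, 2, 3]]
print(f"alpha = {cronbachs_alpha(example):.2f}")
```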
Abstract:
During the treatment of diabetic Charcot neuroarthropathy (CN) of the foot in two young patients, we discovered atypical alterations of their hands, with loss of strength and paresthesia combined with atypical, nonhealing bone alterations and instability. Whereas CN of the foot is a serious and well-known complication of diabetes, CN of the hand is mentioned in only four articles (1–4).
Abstract:
This research investigated the efficacy of a post-discharge nurse-led clinic for patients who underwent a cardiovascular interventional procedure in Australia. A randomised controlled clinical trial measured the effects of the clinic on patients' confidence to self-manage and on psychological distress, given the strong link between anxiety, depression and coronary heart disease. Hospitalisation for the procedure is short and stressful, and patients may wait 7 to 64 days for post-discharge review. This study provides preliminary quantitative and qualitative evidence that nurse-led clinics undertaken within the first week post-percutaneous coronary intervention may fill a much-needed gap for patients during a potentially vulnerable period.
Abstract:
Studies on 300 persons occupationally exposed to the allergenic weed Parthenium hysterophorus L. for periods ranging from 3 to 12 months revealed that 4% of them developed contact dermatitis of the exposed parts of the body, while 56% became sensitized to the weed without apparently exhibiting any dermatitis. None of them suffered from allergic manifestations such as rhinitis or bronchial asthma during the period of study, which extended over 2 years.
Abstract:
Background Around the world, guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) vary greatly. To prevent occlusion, most institutions recommend the use of heparin when the CVC is not in use. However, there is debate regarding the need for heparin and evidence to suggest normal saline may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased costs. Objectives To assess the clinical effects (benefits and harms) of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants, children and adolescents. Design A Cochrane systematic review of randomised controlled trials was undertaken. Data sources The Cochrane Vascular Group Specialised Register (including MEDLINE, CINAHL, EMBASE and AMED) and the Cochrane Register of Studies were searched. Hand searching of relevant journals and reference lists of retrieved articles was also undertaken. Review methods Data were extracted and appraisal undertaken. We included studies that compared the efficacy of normal saline with heparin to prevent occlusion. We excluded temporary CVCs and peripherally inserted central catheters. Rate ratios per 1000 catheter days were calculated for two outcomes, occlusion of the CVC and CVC-associated blood stream infection. Results Three trials with a total of 245 participants were included in this review. The three trials directly compared the use of normal saline and heparin. However, between studies, all used different protocols with various concentrations of heparin and frequency of flushes. The quality of the evidence ranged from low to very low. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin group was 0.75 (95% CI 0.10 to 5.51, two studies, 229 participants, very low quality evidence). The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37, two studies, 231 participants; low quality evidence). Conclusions It remains unclear whether heparin is necessary for CVC maintenance. More well-designed studies are required to understand this relatively simple, but clinically important question. Ultimately, if this evidence were available, the development of evidenced-based clinical practice guidelines and consistency of practice would be facilitated.
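As a minimal sketch of the outcome measure used in this review, the Python snippet below computes a crude rate ratio with a Wald-type 95% confidence interval on the log scale; the event and catheter-day counts are hypothetical, and the review's pooled estimates would have been derived with meta-analytic methods rather than from raw counts like these.

```python
import math

def rate_ratio(events_a, days_a, events_b, days_b, z=1.96):
    """Rate ratio of group A versus group B with an approximate 95% CI on the log scale.
    Rates are events per catheter day; multiplying by 1000 gives events per 1000 catheter days."""
    rr = (events_a / days_a) / (events_b / days_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lower = math.exp(math.log(rr) - z * se_log)
    upper = math.exp(math.log(rr) + z * se_log)
    return rr, lower, upper

# Hypothetical counts: occlusions and catheter days in the normal saline and heparin arms
rr, lower, upper = rate_ratio(events_a=6, days_a=4200, events_b=8, days_b=4100)
print(f"rate ratio {rr:.2f} (95% CI {lower:.2f} to {upper:.2f})")
```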
Abstract:
DNA and the genes it contains direct all cellular activity. However, DNA molecules accumulate mutations as a result of both environmental influences and the cells' own activity. If these errors are not repaired, the cell may turn into a cancer cell. Cells therefore employ several DNA repair mechanisms, one of which is mismatch repair (MMR). MMR is responsible for correcting errors that arise during DNA replication. Inherited mutations in the genes that encode MMR proteins impair DNA repair and predispose their carriers to hereditary nonpolyposis colorectal cancer (HNPCC). The most commonly mutated MMR genes are MLH1 and MSH2. HNPCC is inherited in a dominant manner, meaning that a gene defect inherited from only one parent is enough to predispose to cancer. A carrier of an MMR gene defect has a high lifetime probability of developing cancer, and the age at onset is only about 40 years. Identifying the predisposing gene defect in mutation carriers is very important, because regular surveillance allows a developing tumour to be detected and removed at an early stage. This has been shown to reduce cancer mortality significantly. Reliable knowledge of the origin of the predisposition is also important for those members of a cancer family who do not carry the mutation in question. Alongside predisposing mutations, MMR genes regularly harbour changes that represent normal genetic variation between individuals and are not expected to increase cancer risk. Distinguishing predisposing mutations from these neutral variants is difficult but essential for ensuring effective surveillance of those at risk. In this doctoral thesis, 18 mutations of the MSH2 gene were studied. The mutations had been found in families with a high incidence of cancer, but their effect on DNA repair efficiency and cancer predisposition was unclear. The work examined the effect of each mutation on the normal function of the MSH2 protein, and the results were compared with the clinical data of the patients and families. Of the mutations studied, 12 caused deficiencies in MMR and were interpreted as cancer-predisposing. The 4 mutations that functioned normally in the analyses are probably not the cause of cancer in the families concerned. For 2 mutations, the interpretation was left open. The families carrying the described mutations benefited directly from the study, as information was obtained on the cancer risk conferred by their gene defect, allowing genetic counselling and surveillance to be targeted to those who need it. The work also clarified the mechanisms by which a mutated MSH2 protein can lose its function.
Abstract:
The remarkable advances made in recombinant DNA technology over the last two decades have paved the way for the use of gene transfer to treat human diseases. Several protocols have been developed for the introduction and expression of genes in humans, but clinical efficacy has not been conclusively demonstrated in any of them. The eventual success of gene therapy for genetic and acquired disorders depends on the development of better gene transfer vectors for sustained, long-term expression of foreign genes, as well as a better understanding of the pathophysiology of human diseases. It is heartening to note that some of the gene therapy protocols have found other applications, such as genetic immunization or DNA vaccines, which is being heralded as the third vaccine revolution. Gene therapy is yet to become a dream come true, but the light can be seen at the end of the tunnel.
Abstract:
Undergraduate Medical Imaging (MI) students at QUT attend their first clinical placement towards the end of semester two. Students undertake two (pre)clinical skills development units – one theory and one practical. Students gain good contextual and theoretical knowledge during these units via a blended learning model with multiple learning methods employed. Students attend theory lectures, practical sessions, tutorial sessions in both a simulated and a virtual environment, and pre-clinical scenario-based tutorial sessions. The aim of this project is to evaluate the use of blended learning in the context of first-year Medical Imaging Radiographic Technique and its effectiveness in preparing students for their first clinical experience. It is hoped that the multiple teaching methods employed within the pre-clinical training unit at QUT build students' clinical skills prior to the real situation. A quantitative approach will be taken, evaluating via pre- and post-clinical placement surveys. These data will be correlated with data gained in the previous year on the effectiveness of this training approach prior to clinical placement. In 2014, 59 students surveyed prior to their clinical placement demonstrated positive benefits of using a variety of learning tools to enhance their learning. 98.31% (n = 58) of students agreed or strongly agreed that the theory lectures were a useful tool to enhance their learning. This was followed closely by 97% (n = 57) of the students recognising the value of performing role-play simulation prior to clinical placement. Tutorial engagement was considered useful by 93.22% (n = 55), whilst 88.14% (n = 52) reasoned that the x-raying of phantoms in the simulated radiographic laboratory was beneficial. Self-directed learning yielded 86.44% (n = 51). The virtual reality simulation software was valuable for 72.41% (n = 42) of the students. Of the 4 students who disagreed or strongly disagreed with the usefulness of any tool, all strongly agreed with the usefulness of at least one other learning tool. The impact of the blended learning model in meeting diverse student needs continues to be positive, with students engaging in most offerings. Students largely prefer pre-clinical scenario-based practical and tutorial sessions where 'real-world' situations are discussed.
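The agreement percentages above follow directly from the response counts; the sketch below (Python) reproduces that arithmetic for the 2014 survey, with the assumption that most items had 59 respondents while the virtual reality item had 58 (since 72.41% with n = 42 implies 58 responses).

```python
# Reproduce the reported agreement percentages from the counts in the abstract.
# Denominators are assumptions: 59 respondents for most items, 58 for the virtual reality item.
items = {
    "theory lectures": (58, 59),
    "role-play simulation": (57, 59),
    "tutorial engagement": (55, 59),
    "x-raying phantoms in the simulated laboratory": (52, 59),
    "self-directed learning": (51, 59),
    "virtual reality simulation software": (42, 58),
}

for name, (agreed, responded) in items.items():
    print(f"{name}: {100 * agreed / responded:.2f}% (n = {agreed})")
```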
Abstract:
Glaucoma, an optic neuropathy with excavation of the optic nerve head and a corresponding visual field defect, is one of the leading causes of blindness worldwide. However, visual disability can often be avoided or delayed if the disease is diagnosed at an early stage. Therefore, recognising the risk factors for the development and progression of glaucoma may prevent further damage. The purpose of the present study was to evaluate factors associated with visual disability caused by glaucoma and the genetic features of two risk factors, exfoliation syndrome (ES) and a positive family history of glaucoma. The study material consisted of three groups: 1) deceased glaucoma patients from the Ekenäs practice, 2) glaucoma families from the Ekenäs region, and 3) population-based families with and without exfoliation syndrome from Kökar Island. For the retrospective study, 106 patients with open-angle glaucoma (OAG) were identified. At the last visit, 17 patients were visually impaired. Blindness induced by glaucoma was found in one or both eyes in 16 patients and in both eyes in six patients. The cumulative incidence of glaucoma-caused blindness in one eye was 6% at 5 years, 9% at 10 years, and 15% at 15 years from the initiation of treatment. The factors associated with blindness caused by glaucoma were an advanced stage of glaucoma at diagnosis, fluctuation in intraocular pressure during treatment, the presence of exfoliation syndrome, and poor patient compliance. A cross-sectional population-based study was performed in 1960-1962 on Kökar Island, and the same population was followed until 2002. In total, 965 subjects (530 over 50 years of age) were examined at least once. The prevalence of ES was 18% among subjects older than 50 years. Seventy-five of the 78 ES-positive subjects belonged to the same extended pedigree. According to the segregation and family analysis, exfoliation syndrome appeared to be inherited as an autosomal dominant trait with reduced penetrance. The penetrance was more reduced in males, but the risk of glaucoma was higher in males than in females. To find the gene or genes associated with exfoliation syndrome, a genome-wide scan was performed for 64 members (28 ES-affected and 36 controls) of the Kökar pedigree. A promising result was found: the highest two-point LOD score was 3.45 (θ = 0.04) on chromosome 18q12.1-21.33. The presence of mutations in the glaucoma genes TIGR/MYOC (myocilin) and OPTN (optineurin) was analysed in eight glaucoma families from the Ekenäs region. An inheritance pattern resembling an autosomal dominant mode was detected in all of these families. Primary open-angle glaucoma or exfoliation glaucoma was found in 35% of the 136 family members, and 28% were suspected to have glaucoma. No mutations were detected in these families.
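For context on the linkage statistic reported above, the sketch below (Python) computes a two-point LOD score, LOD(θ) = log10[L(θ) / L(θ = 0.5)], for the simplest phase-known, fully informative case; the genome-wide scan itself would have used pedigree-likelihood software with a penetrance model, so the recombinant and meiosis counts here are purely illustrative.

```python
import math

def two_point_lod(recombinants, meioses, theta):
    """Two-point LOD score for a phase-known, fully informative set of meioses:
    LOD(theta) = log10( theta**r * (1 - theta)**(n - r) / 0.5**n )."""
    r, n = recombinants, meioses
    if theta == 0 and r > 0:
        return float("-inf")  # an observed recombinant makes theta = 0 impossible
    likelihood_theta = (theta ** r) * ((1 - theta) ** (n - r))
    likelihood_null = 0.5 ** n
    return math.log10(likelihood_theta / likelihood_null)

# Illustrative counts only: 1 recombinant among 15 informative meioses
for theta in (0.01, 0.04, 0.10, 0.20):
    print(f"theta = {theta:.2f}  LOD = {two_point_lod(recombinants=1, meioses=15, theta=theta):.2f}")
```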