42 results for Eye detection
Abstract:
Topic detection and tracking (TDT) is an area of information retrieval research focused on news events. The problems TDT deals with relate to segmenting news text into cohesive stories, detecting something new and previously unreported, tracking the development of a previously reported event, and grouping together news stories that discuss the same event. The performance of traditional information retrieval techniques based on full-text similarity has remained inadequate for online production systems; it has been difficult to make the distinction between same and similar events. In this work, we explore ways of representing and comparing news documents in order to detect new events and track their development. First, however, we put forward a conceptual analysis of the notions of topic and event. The purpose is to clarify the terminology and align it with the process of news-making and the tradition of story-telling. Second, we present a framework for document similarity that is based on semantic classes, i.e., groups of words with similar meaning. We adopt people, organizations, and locations as semantic classes in addition to general terms. As each semantic class can be assigned its own similarity measure, document similarity can make use of ontologies, e.g., geographical taxonomies. The documents are compared class-wise, and the outcome is a weighted combination of the class-wise similarities. Third, we incorporate temporal information into document similarity. We formalize the natural-language temporal expressions occurring in the text and use them to anchor the rest of the terms onto the time-line. When comparing documents for event-based similarity, we look not only at matching terms but also at how near their anchors are on the time-line. Fourth, we experiment with an adaptive variant of the semantic class similarity system.
News reflects changes in the real world, and in order to keep up, the system has to adapt its behavior to the contents of the news stream. We put forward two strategies for rebuilding the topic representations and report experimental results. We ran experiments with three annotated TDT corpora. The use of semantic classes increased the effectiveness of topic tracking by 10-30%, depending on the experimental setup. The gain in spotting new events remained lower, around 3-4%. Anchoring the text to a time-line based on the temporal expressions gave a further 10% increase in the effectiveness of topic tracking. The gains in detecting new events, again, remained smaller. The adaptive systems did not improve the tracking results.
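The class-wise similarity scheme described above can be sketched as follows. This is a minimal illustration, not the thesis implementation: the class names follow the abstract (people, organizations, locations, general terms), but the weights, the choice of cosine as the per-class measure, and the toy documents are assumptions.

```python
from collections import Counter
from math import sqrt

# Semantic classes from the abstract; the weights are illustrative only.
CLASS_WEIGHTS = {"persons": 0.2, "organizations": 0.2, "locations": 0.2, "terms": 0.4}

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def class_wise_similarity(doc1: dict, doc2: dict) -> float:
    """Weighted combination of per-class similarities.
    Each document maps a class name -> Counter of terms in that class."""
    return sum(w * cosine(doc1.get(c, Counter()), doc2.get(c, Counter()))
               for c, w in CLASS_WEIGHTS.items())

# Toy documents: only the classes actually present need to be listed.
d1 = {"locations": Counter({"helsinki": 2}), "terms": Counter({"election": 3, "vote": 1})}
d2 = {"locations": Counter({"helsinki": 1}), "terms": Counter({"election": 2})}
print(round(class_wise_similarity(d1, d2), 3))  # → 0.579
```

Because each class keeps its own measure, the cosine here could be swapped for an ontology-aware distance (e.g., a geographical taxonomy for the locations class) without touching the other classes.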
Abstract:
Rhizoctonia spp. are ubiquitous soil-inhabiting fungi that enter into pathogenic or symbiotic associations with plants. In general, Rhizoctonia spp. are regarded as plant-pathogenic fungi; many cause root rot and other plant diseases, resulting in considerable economic losses in both agriculture and forestry. Many Rhizoctonia strains enter into symbiotic mycorrhizal associations with orchids, and some hypovirulent strains are promising biocontrol candidates for preventing host plant infection by pathogenic Rhizoctonia strains. This work focuses on uni- and binucleate Rhizoctonia (UNR and BNR, respectively) strains belonging to the teleomorphic genus Ceratobasidium, but multinucleate Rhizoctonia (MNR) belonging to the teleomorphic genus Thanatephorus and ectomycorrhizal fungal species, such as Suillus bovinus, were also included in the DNA probe development work. Strain-specific probes were developed to target rDNA ITS (internal transcribed spacer) sequences (ITS1, 5.8S, and ITS2) and applied in Southern dot blot and liquid hybridization assays. Liquid hybridization was more sensitive and the size of the hybridized PCR products could be detected simultaneously, but the advantage of Southern hybridization was that sample DNA could be used without additional PCR amplification. The impacts of four Finnish BNR Ceratorhiza sp. strains (251, 266, 268, and 269) on Scots pine (Pinus sylvestris) seedling growth were investigated, and the infection biology and infection levels were microscopically examined following trypan blue staining of infected roots. All BNR strains enhanced early seedling growth and affected the root architecture, while the infection levels remained low. The fungal infection was restricted to the outer cortical regions of long roots, and typical monilioid cells were detected with strain 268. The interactions of the pathogenic UNR Ceratobasidium bicorne strain 1983-111/1N and the endophytic BNR Ceratorhiza sp.
strain 268 were studied in single- or dual-inoculated Scots pine roots. The fungal infection levels and host defence-gene activity of nine transcripts [phenylalanine ammonia lyase (pal1), stilbene synthase (STS), chalcone synthase (CHS), short-root specific peroxidase (Psyp1), antimicrobial peptide gene (Sp-AMP), rapidly elicited defence-related gene (PsACRE), germin-like protein (PsGER1), CuZn-superoxide dismutase (SOD), and dehydrin-like protein (dhy-like)] were measured from differentially treated and untreated control roots by quantitative real-time PCR (qRT-PCR). The infection level of the pathogenic UNR was restricted in BNR-pre-inoculated Scots pine roots, while the UNR was more competitive in simultaneous dual infection. The STS transcript was highly up-regulated in all treated roots, while the CHS, pal1, and Psyp1 transcripts were more moderately activated. No significant activity of the Sp-AMP, PsACRE, PsGER1, SOD, or dhy-like transcripts was detected compared to control roots. The integrated experiments presented provide tools to assist in the future detection of these fungi in the environment and to understand the host infection biology and defence, and the relationships between these interacting fungi in roots and soils. This study further confirms the complexity of the Rhizoctonia group, both phylogenetically and in infection biology and plant host specificity. The knowledge obtained could be applied in integrated forestry nursery management programmes.
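The transcript levels above were measured by qRT-PCR. The abstract does not state which quantification model was applied; a common choice for such relative-expression comparisons is the 2^-ΔΔCt (Livak) method, sketched here with invented Ct values for illustration.

```python
def fold_change(ct_target_treated: float, ct_ref_treated: float,
                ct_target_control: float, ct_ref_control: float) -> float:
    """Relative expression by the 2^-ddCt (Livak) method:
    normalize the target transcript to a reference gene within each sample,
    then compare treated roots against untreated control roots."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2.0 ** (-dd_ct)

# Hypothetical Ct values for a strongly up-regulated transcript such as STS:
print(fold_change(22.0, 18.0, 27.0, 18.0))  # → 32.0
```

A fold change of 1.0 means no change relative to control; values well above 1.0 correspond to the up-regulation reported for STS.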
Abstract:
Staphylococcus aureus is one of the most important bacteria causing disease in humans, and methicillin-resistant S. aureus (MRSA) has become the most commonly identified antibiotic-resistant pathogen in many parts of the world. MRSA rates were stable for many years in the Nordic countries and the Netherlands, which have a low MRSA prevalence within Europe, but in recent decades MRSA rates have increased in these low-prevalence countries as well. MRSA has become established as a major hospital pathogen, but has also been found increasingly in long-term facilities (LTF) and in communities of persons with no connections to the health-care setting. In Finland, the annual number of MRSA isolates reported to the National Infectious Disease Register (NIDR) has increased steadily, especially outside the Helsinki metropolitan area. Molecular typing has revealed numerous outbreak strains of MRSA, some of which have previously been associated with community acquisition. In this work, data on MRSA cases notified to the NIDR and on MRSA strain types identified with pulsed-field gel electrophoresis (PFGE), multilocus sequence typing (MLST), and staphylococcal cassette chromosome mec (SCCmec) typing at the National Reference Laboratory (NRL) in Finland from 1997 to 2004 were analyzed. An increasing trend in MRSA incidence in Finland from 1997 to 2004 was shown. In addition, non-multi-drug-resistant (NMDR) MRSA isolates, especially those resistant only to methicillin/oxacillin, showed an emerging trend. The predominant MRSA strains changed over time and place, but two internationally spread epidemic strains of MRSA, FIN-16 and FIN-21, were related to the most recently detected increase. Those strains were also one cause of the strikingly increasing number of invasive MRSA findings. A rise of MRSA strains with SCCmec types IV or V, possibly community-acquired MRSA, was also detected.
Questionnaires were used to review the diagnostic methods used for MRSA identification in Finnish microbiology laboratories and the number of MRSA screening specimens studied. Surveys focusing on the MRSA situation in long-term facilities in 2001 and on the background information of MRSA-positive persons in 2001-2003 were also carried out. The rates of MRSA and the screening practices varied widely across geographic regions. Some of the NMDR MRSA strains may remain undetected in laboratories because of the insufficient diagnostic techniques used. The increasing proportion of the elderly population carrying MRSA suggests that MRSA is an emerging problem in Finnish long-term facilities. Among the patients, 50% of the specimens were taken on a clinical basis, 43% on a screening basis after exposure to MRSA, 3% on a screening basis because of hospital contact abroad, and 4% for other reasons. In response to an outbreak of MRSA possessing a new genotype that occurred in a health-care ward and in an associated nursing home of a small municipality in Northern Finland in autumn 2003, a point-prevalence survey was performed six months later. In the same study, the molecular epidemiology of MRSA and methicillin-sensitive S. aureus (MSSA) strains was also assessed, the results were compared with the national strain collection, and the difficulties of MRSA screening with low-level oxacillin-resistant isolates were documented. The original MRSA outbreak in the LTF, which consisted of isolates possessing a nationally new PFGE profile (FIN-22) and an internationally rare MLST type (ST-27), was confined. Another previously unrecognized MRSA strain was found with additional screening, possibly indicating that current routine MRSA screening methods may be insufficiently sensitive for strains possessing low-level oxacillin resistance.
Most of the MSSA strains found were genotypically related to the epidemic MRSA strains, but only a few of them had received the SCCmec element, and all those strains possessed the new SCCmec type V. In the second-largest nursing home in Finland, colonization by S. aureus and MRSA was studied, along with the effect of the screening sites and broth enrichment culture on the sensitivity of S. aureus detection. Combining the use of enrichment broth and perineal swabbing, in addition to swabbing of the nostrils and skin lesions, may be an alternative to throat swabs in the nursing home setting, especially when residents are uncooperative. Finally, in order to evaluate the phenotypic and genotypic methods needed for reliable laboratory diagnostics of MRSA, the oxacillin disk diffusion and MIC tests were compared with the cefoxitin disk diffusion method at both +35°C and +30°C, with or without the addition of sodium chloride (NaCl) to the Müller-Hinton test medium, and in-house PCR was compared with two commercial molecular methods (the GenoType® MRSA test and the EVIGENE™ MRSA Detection test) using different bacterial species in addition to S. aureus. The cefoxitin disk diffusion method was superior to the oxacillin disk diffusion and MIC tests in predicting mecA-mediated resistance in S. aureus when incubating at +35°C, with or without the addition of NaCl to the test medium. Both the GenoType® MRSA and EVIGENE™ MRSA Detection tests are usable, accurate, cost-effective, and sufficiently fast methods for rapid MRSA confirmation from a pure culture.
Abstract:
Composting refers to the aerobic degradation of organic material and is one of the main waste treatment methods used in Finland for treating separated organic waste. The composting process converts organic waste into a humus-like end product which can be used to increase the organic matter in agricultural soils, in gardening, or in landscaping. Microbes play a key role as degraders during the composting process, and the microbiology of composting has been studied for decades, but there are still open questions regarding the microbiota in industrial composting processes. It is known that with traditional, culturing-based methods only a small fraction, below 1%, of the species in a sample is normally detected. In recent years an immense diversity of bacteria, fungi, and archaea has been found to occupy many different environments. Therefore, the methods for characterising microbes constantly need to be developed further. In this thesis the presence of fungi and bacteria in full-scale and pilot-scale composting processes was characterised by cloning and sequencing. Several clone libraries were constructed, and altogether nearly 6000 clones were sequenced. The microbial communities detected in this study were found to differ from the compost microbes observed in previous research with cultivation-based methods or with molecular methods in smaller-scale processes, although there were similarities as well. The bacterial diversity was high. Based on non-parametric coverage estimations, the number of bacterial operational taxonomic units (OTUs) in certain stages of composting was over 500. Sequences similar to Lactobacillus and Acetobacteria were frequently detected in the early stages of drum composting. In the tunnel stages of composting, the bacterial community comprised Bacillus, Thermoactinomyces, Actinobacteria, and Lactobacillus.
The fungal diversity was found to be high, and phylotypes similar to yeasts were abundant in the full-scale drum and tunnel processes. In addition to phylotypes similar to Candida, Pichia, and Geotrichum, moulds from the genera Thermomyces and Penicillium were observed in the tunnel stages of composting. Zygomycetes were detected in the pilot-scale composting processes and in the compost piles. In some of the samples a few abundant phylotypes present in the clone libraries masked the rare ones. The rare phylotypes were of interest, and a method for collecting them from clone libraries for sequencing was developed. With negative selection of the abundant phylotypes, the rare ones were picked from the clone libraries. Thus 41% of the clones in the studied clone libraries were sequenced. Since microbes play a central role in composting and in many other biotechnological processes, rapid methods for characterization of microbial diversity would be of value, both scientifically and commercially. Current methods, however, lack sensitivity and specificity and are therefore under development. Microarrays have been used in microbial ecology for a decade to study the presence or absence of certain microbes of interest in a multiplex manner. The sequence database collected in this thesis was used as the basis for probe design and microarray development. An enzyme-assisted detection method, the ligation-detection-reaction (LDR) based microarray, was adapted for species-level detection of microbes characteristic of each stage of the composting process. With the use of a specially designed control probe it was established that a species-specific probe can detect target DNA representing as little as 0.04% of the total DNA in a sample. The developed microarray can be used to monitor composting processes or the hygienisation of the compost end product. A large compost microbe sequence dataset was collected and analysed in this thesis.
The results provide valuable information on microbial community composition during industrial-scale composting processes. The microarray method was developed based on the sequence database collected in this study. The method can be utilised for following the fate of microbes of interest during the composting process in an extremely sensitive and specific manner. The platform for the microarray is universal, and the method can easily be adapted for studying microbes from environments other than compost.
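The OTU richness figure quoted above ("over 500") comes from non-parametric coverage estimation. The abstract does not name the estimator used; the Chao1 estimator is one widely used non-parametric choice, sketched here with invented counts for illustration.

```python
def chao1(s_obs: int, singletons: int, doubletons: int) -> float:
    """Chao1 non-parametric richness estimate: the observed OTU count plus a
    correction driven by OTUs seen exactly once (singletons) or twice
    (doubletons) — many singletons imply many species still unseen."""
    if doubletons == 0:
        # Bias-corrected form, which avoids division by zero.
        return s_obs + singletons * (singletons - 1) / 2.0
    return s_obs + singletons ** 2 / (2.0 * doubletons)

# Invented counts, not data from the thesis:
print(chao1(300, 120, 30))  # → 540.0
```

The estimate only ever adds to the observed count, so a clone library with many singletons signals that sequencing has not yet exhausted the community's diversity.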
Abstract:
Inherited retinal diseases are the most common cause of vision loss among the working population in Western countries. It is estimated that ~1 of the people worldwide suffer from vision loss due to inherited retinal diseases. The severity of these diseases varies from partial vision loss to total blindness, and at the moment no effective cure exists. To date, nearly 200 loci for inherited retinal diseases have been mapped, including 140 cloned genes. By a rough estimate, 50% of the retinal dystrophy genes still await discovery. In this thesis we aimed to study the genetic background of two inherited retinal diseases, X-linked cone-rod dystrophy and Åland Island eye disease. X-linked cone-rod dystrophy (CORDX) is characterized by progressive loss of visual function at school age or in early adulthood. Affected males show reduced visual acuity, photophobia, myopia, color vision defects, central scotomas, and variable fundus changes. The disease is genetically heterogeneous, and two disease loci, CORDX1 and CORDX2, were known prior to the present thesis work. CORDX1, located on Xp21.1-11.4, is caused by mutations in the RPGR gene. CORDX2 is located on Xq27-28, but the causative gene is still unknown. Åland Island eye disease (AIED), originally described in a family living in the Åland Islands, is a congenital retinal disease characterized by decreased visual acuity, fundus hypopigmentation, nystagmus, astigmatism, a red color vision defect, myopia, and defective night vision. AIED shares similarities with another retinal disease, congenital stationary night blindness (CSNB2). Mutations in the L-type calcium channel α1F-subunit gene, CACNA1F, are known to cause CSNB2, as well as AIED-like disease. The disease locus of the original AIED family maps to the same genetic interval as the CACNA1F gene, but efforts to reveal CACNA1F mutations in patients of the original AIED family have been unsuccessful.
The specific aims of this study were to map the disease gene in a large Finnish family with X-linked cone-rod dystrophy and to identify the disease-causing genes in the patients of the Finnish cone-rod dystrophy family and the original AIED family. Using linkage and haplotype analyses, we localized the disease gene of the Finnish cone-rod dystrophy family to the Xp11.4-Xq13.1 region, and thus established a new X-linked cone-rod dystrophy locus, CORDX3. Mutation analyses of candidate genes revealed three novel CACNA1F gene mutations: IVS28-1 GCGTC>TGG in CORDX3 patients, a 425 bp deletion comprising exon 30 and flanking intronic regions in AIED patients, and IVS16+2T>C in an additional Finnish patient with a CSNB2-like phenotype. All three novel mutations altered splice sites of the CACNA1F gene and resulted in defective pre-mRNA splicing, suggesting altered or absent channel function as a disease mechanism. The analyses of CACNA1F mRNA also revealed novel alternative wild-type splice variants, which may enhance channel diversity or regulate the overall expression level of the channel. The results of our studies may be utilized in genetic counseling of the families, and they provide a basis for studies on the pathogenesis of these diseases. In the future, knowledge of the genetic defects may be used in the identification of specific therapies for the patients.
Abstract:
Infection is a major cause of mortality and morbidity after thoracic organ transplantation. The aim of the present study was to evaluate the infectious complications after lung and heart transplantation, with a special emphasis on the usefulness of bronchoscopy and the demonstration of cytomegalovirus (CMV), human herpes virus (HHV)-6, and HHV-7. We reviewed all the consecutive bronchoscopies performed on heart transplant recipients (HTRs) from May 1988 to December 2001 (n = 44) and lung transplant recipients (LTRs) from February 1994 to November 2002 (n = 472). To compare different assays in the detection of CMV, a total of 21 thoracic organ transplant recipients were prospectively monitored by CMV pp65-antigenemia, DNAemia (PCR), and mRNAemia (NASBA) tests. The antigenemia test was the reference assay for therapeutic intervention. In addition to CMV antigenemia, 22 LTRs were monitored for HHV-6 and HHV-7 antigenemia. The diagnostic yield of the clinically indicated bronchoscopies was 41% in the HTRs and 61% in the LTRs. The utility of bronchoscopy was highest from one to six months after transplantation. In contrast, the findings from the surveillance bronchoscopies performed on LTRs led to a change in the previous treatment in only 6% of the cases. Pneumocystis carinii and CMV were the most commonly detected pathogens. Furthermore, 15 (65%) of the P. carinii infections in the LTRs were detected during chemoprophylaxis. None of the complications of the bronchoscopies was fatal. Antigenemia, DNAemia, and mRNAemia were present in 98%, 72%, and 43% of the CMV infections, respectively. The optimal DNAemia cut-off levels (sensitivity/specificity) were 400 (75.9%/92.7%), 850 (91.3%/91.3%), and 1250 (100%/91.5%) copies/ml for antigenemia levels of 2, 5, and 10 pp65-positive leukocytes/50 000 leukocytes, respectively. The sensitivities of the NASBA were 25.9%, 43.5%, and 56.3% in detecting the same cut-off levels.
CMV DNAemia was detected in 93% and mRNAemia in 61% of the CMV antigenemias requiring antiviral therapy. HHV-6, HHV-7, and CMV antigenemia was detected in 20 (91%), 11 (50%), and 12 (55%) of the 22 LTRs (median 16, 31, and 165 days), respectively. HHV-6 appeared in 15 (79%), HHV-7 in seven (37%), and CMV in one (7%) of these patients during ganciclovir or valganciclovir prophylaxis. One case of pneumonitis and another of encephalitis were associated with HHV-6. In conclusion, bronchoscopy is a safe and useful diagnostic tool in LTRs and HTRs with a suspected respiratory infection, but the role of surveillance bronchoscopy in LTRs remains controversial. The PCR assay performs comparably to the antigenemia test in guiding pre-emptive therapy against CMV when threshold levels of over 5 pp65-antigen-positive leukocytes are used. In contrast, the low sensitivity of NASBA limits its usefulness. HHV-6 and HHV-7 activation is common after lung transplantation despite ganciclovir or valganciclovir prophylaxis, but clinical manifestations are infrequently linked to them.
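The cut-off analysis above evaluates each PCR viral-load threshold against the pp65-antigenemia reference assay. The underlying calculation can be sketched as follows; the sample data and thresholds here are invented for illustration and do not reproduce the study's figures.

```python
def sensitivity_specificity(samples, cutoff_copies, antigenemia_threshold):
    """Classify each (dna_copies_per_ml, pp65_positive_cells) sample against
    the antigenemia reference and score the DNAemia cut-off against it."""
    tp = fp = tn = fn = 0
    for copies, pp65 in samples:
        ref_pos = pp65 >= antigenemia_threshold   # reference assay result
        test_pos = copies >= cutoff_copies        # PCR cut-off under evaluation
        if ref_pos and test_pos:
            tp += 1
        elif ref_pos:
            fn += 1
        elif test_pos:
            fp += 1
        else:
            tn += 1
    sens = tp / (tp + fn) if tp + fn else 0.0
    spec = tn / (tn + fp) if tn + fp else 0.0
    return sens, spec

# Invented samples: (CMV DNA copies/ml, pp65-positive leukocytes/50,000)
data = [(100, 0), (500, 1), (900, 6), (2000, 12), (300, 7), (1500, 4)]
sens, spec = sensitivity_specificity(data, cutoff_copies=850, antigenemia_threshold=5)
print(round(sens, 3), round(spec, 3))  # → 0.667 0.667
```

Sweeping `cutoff_copies` over a grid and picking the value with the best sensitivity/specificity trade-off is how an "optimal" cut-off for a given antigenemia threshold can be chosen.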
Abstract:
The purpose of this study was to estimate the prevalence and distribution of reduced visual acuity, major chronic eye diseases, and the subsequent need for eye care services in the Finnish adult population comprising persons aged 30 years and older. In addition, we analyzed the effect of decreased vision on functioning and the need for assistance using the World Health Organization’s (WHO) International Classification of Functioning, Disability, and Health (ICF) as a framework. The study was based on the Health 2000 health examination survey, a nationally representative population-based comprehensive survey of health and functional capacity carried out in 2000–2001 in Finland. The study sample representing the Finnish population aged 30 years and older was drawn by two-stage stratified cluster sampling. The Health 2000 survey included a home interview and a comprehensive health examination conducted at a nearby screening center. If the invited participants did not attend, an abridged examination was conducted at home or in an institution. Based on our findings in participants, the great majority (96%) of Finnish adults had at least moderate visual acuity (VA ≥ 0.5) with current refraction correction, if any. However, in the age group 75–84 years the prevalence decreased to 81%, and after 85 years to 46%. In the population aged 30 years and older, the prevalence of habitual visual impairment (VA ≤ 0.25) was 1.6%, and 0.5% were blind (VA < 0.1). The prevalence of visual impairment increased significantly with age (p < 0.001), and after the age of 65 years the increase was sharp. Visual impairment was equally common in both sexes (OR 1.20, 95% CI 0.82 – 1.74). Based on self-reported and/or register-based data, the estimated total prevalences of cataract, glaucoma, age-related maculopathy (ARM), and diabetic retinopathy (DR) in the study population were 10%, 5%, 4%, and 1%, respectively. The prevalence of all of these chronic eye diseases increased with age (p < 0.001).
Cataract and glaucoma were more common in women than in men (OR 1.55, 95% CI 1.26 – 1.91 and OR 1.57, 95% CI 1.24 – 1.98, respectively). The most prevalent eye diseases in people with visual impairment (VA ≤ 0.25) were ARM (37%), unoperated cataract (27%), glaucoma (22%), and DR (7%). More than one-half (58%) of visually impaired people had had a vision examination during the past five years, and 79% had received some vision rehabilitation services, mainly in the form of spectacles (70%). Only one-third (31%) had received formal low vision rehabilitation (i.e., fitting of low vision aids, patient education, training for orientation and mobility, training for activities of daily living (ADL), or consultation with a social worker). People with low vision (VA 0.1 – 0.25) were less likely to have received formal low vision rehabilitation, magnifying glasses, or other low vision aids than blind people (VA < 0.1). Furthermore, low cognitive capacity and living in an institution were associated with limited use of vision rehabilitation services. Of the visually impaired living in the community, 71% reported a need for assistance and 24% had an unmet need for assistance in everyday activities. The prevalence of limitations in ADL, instrumental activities of daily living (IADL), and mobility increased with decreasing VA (p < 0.001). Visually impaired persons (VA ≤ 0.25) were four times more likely to have ADL disabilities than those with good VA (VA ≥ 0.8) after adjustment for sociodemographic and behavioral factors and chronic conditions (OR 4.36, 95% CI 2.44 – 7.78). Limitations in IADL and measured mobility were five times as likely (OR 4.82, 95% CI 2.38 – 9.76 and OR 5.37, 95% CI 2.44 – 7.78, respectively), and self-reported mobility limitations were three times as likely (OR 3.07, 95% CI 1.67 – 9.63), as in persons with good VA.
The high prevalence of age-related eye diseases and subsequent visual impairment in the fastest-growing segment of the population will result in a substantial increase in the demand for eye care services in the future. Many of the visually impaired, especially older persons with decreased cognitive capacity or those living in an institution, have not had a recent vision examination and lack adequate low vision rehabilitation. This highlights the need for regular evaluation of visual function in the elderly and active dissemination of information about rehabilitation services. Decreased VA is strongly associated with functional limitations, and even a slight decrease in VA was found to be associated with limited functioning. Thus, continuous efforts are needed to identify and treat eye diseases to maintain patients’ quality of life and to alleviate the social and economic burden of serious eye diseases.
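The associations in this abstract are reported as odds ratios with 95% confidence intervals. As a sketch, a crude (unadjusted) OR and its Wald confidence interval can be computed from a 2×2 table as below; note that the study's reported ORs were adjusted for covariates via regression, and the counts used here are invented.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Invented counts: e.g. ADL disability among visually impaired vs. good-VA persons.
or_, lo, hi = odds_ratio_ci(20, 80, 5, 95)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # → 4.75 1.71 13.23
```

A CI that excludes 1.0, as here, corresponds to a statistically significant association at the 5% level.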
Abstract:
This study aimed to investigate the morphology and function of corneal sensory nerves in 1) patients after corneal refractive surgery and 2) patients with dry eye due to Sjögren's syndrome. A third aim was to explore the possible correlation between cytokines detected in tears and the development of post-PRK subepithelial haze. The main methods used were tear fluid ELISA analysis, corneal in vivo confocal microscopy, and noncontact esthesiometry. The results revealed that after PRK a positive correlation exists between the regeneration of subbasal nerves and the thickness of the regenerated epithelium. Pre- or postoperative levels of the tear fluid cytokines TGF-β1, TNF-α, or PDGF-BB did not correlate with the development of corneal haze objectively estimated by in vivo confocal microscopy 3 months after PRK. After high myopic LASIK, a discrepancy between subjective dry eye symptoms and objective signs of dry eye was observed. The majority of patients reported ongoing dry eye symptoms even 5 years after LASIK, although no objective clinical signs of dry eye were apparent. In addition, no difference in corneal sensitivity was observed between these patients and controls. Primary Sjögren's syndrome patients presented with corneal hypersensitivity, although their corneal subbasal nerve density was normal. However, alterations in corneal nerve morphology (nerve sprouting and thickened stromal nerves) and an increased number of antigen-presenting cells among subbasal nerves were observed, indicating ongoing inflammation. Based on these results, the relationship between nerve regeneration and epithelial thickness 3 months after PRK appears to reflect the trophic effect of corneal nerves on the epithelium. In addition, measurement of tear fluid cytokines may not be suitable for screening patients for risk of scar (haze) formation after PRK.
Presumably, at least some of the symptoms of "LASIK-associated dry eye" derive from aberrantly regenerated and abnormally functioning corneal nerves. Thus, they may represent a form of corneal neuropathy or "phantom pain" rather than conventional dry eye. Corneal nerve alterations and inflammatory findings in Sjögren's syndrome offer an explanation for the corneal hypersensitivity, or even chronic pain or hyperalgesia, often observed in these patients. In severe cases of disabling chronic pain in patients with dry eye or after LASIK, when conventional therapeutic options fail to offer relief, consultation with a physician specialized in pain treatment is recommended.
Abstract:
Drugs and surgical techniques may have harmful renal effects during the perioperative period. Traditional biomarkers are often insensitive to minor renal changes, but novel biomarkers may more accurately detect disturbances in glomerular and tubular function and integrity. The purpose of this study was, first, to evaluate the renal effects of ketorolac and clonidine during inhalation anesthesia with sevoflurane and isoflurane, and second, to evaluate the effect of tobacco smoking on the production of inorganic fluoride (F-) following enflurane and sevoflurane anesthesia, as well as to determine the effect of F- on renal function and cellular integrity in surgical patients. A total of 143 patients undergoing either conventional (n = 75) or endoscopic (n = 68) inpatient surgery were enrolled in four studies. The ketorolac and clonidine studies were prospective, randomized, placebo-controlled, and double-blinded, while the cigarette smoking studies were prospective cohort studies with two parallel groups. As a sign of proximal tubular deterioration, a similar transient increase in urine N-acetyl-beta-D-glucosaminidase/creatinine (U-NAG/crea) was noted in both the ketorolac group and the controls (baseline vs. at two hours of anesthesia, p = 0.015) with 3.3 minimum alveolar concentration hours of sevoflurane anesthesia. Uncorrected U-NAG increased above the maximum concentration measured in healthy volunteers (6.1 units/l) in 5/15 patients with ketorolac and in none of the controls (p = 0.042). As a sign of proximal tubular deterioration, U-glutathione transferase-alpha/crea (U-GST-alpha/crea) increased in both groups at two hours after anesthesia, but a more pronounced increase was noted in the patients with ketorolac. U-GST-alpha/crea increased above the maximum ratio measured in healthy volunteers in 7/15 patients with ketorolac and in 3/15 controls.
Clonidine diminished the activation of the renin-angiotensin-aldosterone system during pneumoperitoneum; urine output was better preserved in the patients treated with clonidine (1/15 patients developed oliguria) than in the controls (8/15 developed oliguria; p = 0.005). Most patients with pneumoperitoneum and isoflurane anesthesia developed a transient proximal tubular deterioration, as U-NAG increased above 6.1 units/L in 11/15 patients with clonidine and in 7/15 controls. In the patients receiving clonidine treatment, the median U-NAG/crea was higher than in the controls at 60 minutes of pneumoperitoneum (p = 0.01), suggesting that clonidine worsens proximal tubular deterioration. Smoking induced the metabolism of enflurane, but renal function remained intact in both the smokers and the non-smokers with enflurane anesthesia. In contrast, smoking did not induce sevoflurane metabolism, but glomerular function decreased in 4/25 non-smokers and in 7/25 smokers with sevoflurane anesthesia. All five patients with S-F- ≥ 40 micromol/L, but only 6/45 with S-F- less than 40 micromol/L (p = 0.001), developed a S-tumor-associated trypsin inhibitor concentration above 3 nmol/L as a sign of glomerular dysfunction. As a sign of proximal tubular deterioration, U-beta 2-microglobulin increased in 2/5 patients with S-F- over 40 micromol/L compared to 2/45 patients whose highest S-F- was less than 40 micromol/L (p = 0.005). To conclude, sevoflurane anesthesia may cause a transient proximal tubular deterioration, which may be worsened by co-administration of ketorolac. Clonidine premedication prevents the activation of the renin-angiotensin-aldosterone system and preserves normal urine output, but may be harmful to the proximal tubules during pneumoperitoneum. Smoking induces the metabolism of enflurane but not that of sevoflurane.
A serum F- concentration of 40 micromol/L or higher may induce glomerular dysfunction and proximal tubular deterioration in patients undergoing sevoflurane anesthesia. The novel renal biomarkers warrant further study to establish reference values for surgical patients having inhalation anesthesia.
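The small-group comparisons reported above (e.g. 5/15 patients with ketorolac vs. 0/15 controls, p = 0.042) are the kind of 2x2 contingency comparison typically analyzed with Fisher's exact test. The abstract does not name the test used, so this is an assumption; a minimal stdlib-only sketch of the two-sided test is:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables with the same
    margins that are at most as likely as the observed one.
    """
    row1, row2 = a + b, c + d
    col1 = a + c
    total = comb(row1 + row2, col1)

    def p_table(x):
        # probability of x "successes" in row 1, margins held fixed
        return comb(row1, x) * comb(row2, col1 - x) / total

    p_obs = p_table(a)
    lo = max(0, col1 - row2)
    hi = min(col1, row1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# 5/15 ketorolac patients vs. 0/15 controls exceeded the U-NAG cut-off
p = fisher_exact_two_sided(5, 10, 0, 15)
print(round(p, 3))  # → 0.042
```

The result matches the p = 0.042 reported in the abstract, consistent with (though not proof of) this choice of test.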
Abstract:
Background. Kidney transplantation (KTX) is considered the best treatment for terminal uremia. Despite improvements in short-term graft survival, a considerable number of kidney allografts are lost to the premature death of patients with a functioning kidney and to chronic allograft nephropathy (CAN). Aim. To investigate the risk factors involved in the progression of CAN and to analyze diagnostic methods for this entity. Materials and methods. Altogether, 153 implant and 364 protocol biopsies obtained between June 1996 and April 2008 were analyzed. The biopsies were classified according to Banff '97 and the chronic allograft damage index (CADI). Immunohistochemistry for TGF-β1 was performed in 49 biopsies. Kidney function was evaluated by creatinine and/or cystatin C measurement and by various estimates of glomerular filtration rate (GFR). Demographic data of the donors and recipients were recorded after 2 years' follow-up. Results. Most of the 3-month biopsies (73%) were nearly normal. The mean CADI score in the 6-month biopsies decreased significantly after 2001. Diastolic hypertension correlated with ΔCADI. Serum creatinine concentration at hospital discharge and glomerulosclerosis were risk factors for ΔCADI. High total and LDL cholesterol, low HDL cholesterol, and hypertension correlated with chronic histological changes. The mean age of the donors increased from 41 to 52 years. Older donors were more often women who had died from an underlying disease. The prevalence of delayed graft function increased over the years, while acute rejections (AR) decreased significantly. Subclinical AR was observed in 4% of the biopsies, and it did not affect long-term allograft function or CADI. The recipients' drug treatment was modified over the course of the studies: mycophenolate mofetil, tacrolimus, statins, and blockers of the renin-angiotensin system were prescribed more frequently after 2001. Patients with a higher ΔCADI had lower GFR during follow-up.
A CADI over 2 was best predicted by creatinine, although with modest sensitivity and specificity. Neither cystatin C nor other estimates of GFR were superior to creatinine for CADI prediction. Cyclosporine A toxicity was seldom seen. A low cyclosporine A concentration at 2 hours correlated with TGF-β1 expression in interstitial inflammatory cells, and this predicted worse graft function. Conclusions. The progression of CAN has been affected by two major factors: the donors' characteristics and the recipients' hypertension. The increased prevalence of delayed graft function might be a consequence of the acceptance of older donors who had died from an underlying disease. Implant biopsies proved to be of prognostic value, and they are essential for comparison with subsequent biopsies. The progression of histological damage was associated with hypertension and dyslipidemia. The significance of the augmented expression of TGF-β1 in inflammatory cells is unclear, but it may be related to low immunosuppression. Serum creatinine is the most suitable tool for monitoring kidney allograft function on an everyday basis. However, protocol biopsies at 6 and 12 months predicted late kidney allograft dysfunction and affected the clinical management of the patients. Protocol biopsies are thus a suitable surrogate to be used in clinical trials and for monitoring kidney allografts.
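The "estimates of GFR" compared above are closed-form formulas computed from serum creatinine plus simple demographics. The abstract does not state which estimating equations were evaluated, so as an illustrative assumption here is a minimal sketch of one widely used creatinine-based estimate, the Cockcroft-Gault formula:

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Cockcroft-Gault estimate of creatinine clearance (mL/min).

    Shown only as a representative creatinine-based estimate of kidney
    function; the thesis abstract does not specify which GFR estimates
    were actually used.
    """
    crcl = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl  # 0.85 correction for women

# hypothetical 52-year-old, 70 kg male with serum creatinine 1.0 mg/dL
print(round(cockcroft_gault(52, 70, 1.0, female=False), 1))  # → 85.6
```

Because every such estimate is a fixed function of creatinine, it is plausible that none outperformed creatinine itself for CADI prediction, as the abstract reports.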
Abstract:
Purpose: The aim of the present study was to develop and test new digital imaging equipment and methods for the diagnosis and follow-up of ocular diseases. Methods: The whole material comprised 398 subjects (469 examined eyes), including 241 patients with melanocytic choroidal tumours, 56 patients with melanocytic iris tumours, 42 patients with diabetes, a 52-year-old patient with the chronic phase of Vogt-Koyanagi-Harada (VKH) disease, a 30-year-old patient with an old blunt eye injury, and 57 normal healthy subjects. Digital 50° (Topcon TRC 50 IA) and 45° (Canon CR6-45NM) fundus cameras, a new handheld digital colour video camera for eye examinations (MediTell), a new subtraction method using the Topcon Image Net program (Topcon Corporation, Tokyo, Japan), a new method we developed for digital infrared transillumination (IRT) imaging of the iris, and a Zeiss photo slit-lamp with a digital camera body were used for digital imaging. Results: Digital 50° red-free imaging had a sensitivity of 97.7%, and two-field 45° and 50° colour imaging a sensitivity of 88.9-94%. The specificity of the digital 45°-50° imaging modalities was 98.9-100% versus the reference standard, and the proportion of ungradeable images was 1.2-1.6%. With the handheld digital colour video camera alone, only the optic disc and the central fundus within 20° of the fovea could be recorded, yielding a sensitivity of 6.9% for detection of at least mild non-proliferative diabetic retinopathy (NPDR) compared with the reference standard. Comparative use of digital colour, red-free, and red light imaging showed 85.7% sensitivity, 99% specificity, and 98.2% exact agreement versus the reference standard in the differentiation of small choroidal melanoma from pseudomelanoma. The new subtraction method showed growth in four of 94 melanocytic tumours (4.3%) during a mean ± SD follow-up of 23 ± 11 months.
The new digital IRT imaging of the iris showed the sphincter muscle and the radial contraction folds of Schwalbe in the pupillary zone, and the radial structural folds of Schwalbe and circular contraction furrows in the ciliary zone of the iris. The 52-year-old patient with the chronic phase of VKH disease showed extensive atrophy and occasional pigment clumps in the iris stroma, detachment of the ciliary body with severe ocular hypotony, and shallow retinal detachment of the posterior pole in both eyes. Infrared transillumination imaging and fluorescein angiographic findings of the iris showed that IR translucence (p = 0.53), complete masking of fluorescence (p = 0.69), presence of disorganized vessels (p = 0.32), and fluorescein leakage (p = 1.0) at the site of the lesion did not differentiate an iris nevus from a melanoma. Conclusions: Digital 50° red-free and two-field 50° or 45° colour imaging were suitable for diabetic retinopathy (DR) screening, whereas the handheld digital video camera did not fulfill the needs of DR screening. Comparative use of digital colour, red-free, and red light imaging was a suitable method for differentiating small choroidal melanoma from the various pseudomelanomas. The subtraction method may reveal early growth of melanocytic choroidal tumours. Digital IRT imaging may be used to study changes in the stroma and posterior surface of the iris in various diseases of the uvea. It contributed to revealing iris atrophy and serous detachment of the ciliary body with ocular hypotony, together with shallow retinal detachment of the posterior pole, as new findings in the chronic phase of VKH disease. Infrared translucence and angiographic findings are useful in the differential diagnosis of melanocytic iris tumours, but they cannot be used to determine whether a lesion is benign or malignant.
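The screening figures quoted throughout this abstract (sensitivity, specificity, exact agreement) all derive from a 2x2 confusion matrix of the imaging method against the reference standard. A minimal sketch of those computations; the counts below are hypothetical illustrations, not taken from the study:

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and exact agreement from a 2x2
    confusion matrix versus a reference standard."""
    sensitivity = tp / (tp + fn)                  # diseased eyes correctly flagged
    specificity = tn / (tn + fp)                  # healthy eyes correctly passed
    agreement = (tp + tn) / (tp + fp + fn + tn)   # overall exact agreement
    return sensitivity, specificity, agreement

# hypothetical counts (not from the study): 43 of 44 diseased eyes
# detected, 1 false positive among 90 healthy eyes
sens, spec, agree = screening_metrics(tp=43, fp=1, fn=1, tn=89)
print(f"{sens:.1%} {spec:.1%} {agree:.1%}")  # → 97.7% 98.9% 98.5%
```

Reporting sensitivity and specificity separately, as the study does, matters because a screening camera can trade one for the other simply by flagging more or fewer eyes as abnormal.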