22 results for "next 12 months"

in Helda - Digital Repository of University of Helsinki


Relevance: 90.00%

Abstract:

Selenium (Se) has been demonstrated to be an essential trace element for the maintenance of animal and human health. Although it has not been confirmed to be an essential micronutrient in higher plants, there is increasing evidence that Se functions as an antioxidant in plants. Selenium has been shown to exert a beneficial effect on crop growth and to promote stress tolerance at low concentrations. However, the specific physiological mechanisms that underlie the positive effects of Se in plants have not been clearly elucidated. The aims of this study were to determine the Se concentration in potato (Solanum tuberosum L.) and the effects of Se on the accumulation of carbohydrates, growth and yield in potato plants. An additional aim was to study the impact of Se on the total glycoalkaloid concentration in immature potato tubers. The distribution of Se among different biochemical Se fractions and the effect of storage on the Se concentration were studied in Se-enriched tubers. Furthermore, the effect of Se on raw darkening and the translocation of Se from seed tubers to the next tuber generation were investigated. Given the established anti-ageing properties of Se, it was also of interest to study whether Se affects the physiological age and growth vigour of seed tubers. The Se concentrations in the upper leaves, roots, stolons and tubers of potato increased with increasing Se supplementation. The highest Se concentrations were reached in young upper leaves, roots and stolons, indicating that added selenate was efficiently taken up and utilized at an early stage. During the growing period the Se concentration declined in the aerial parts, roots and stolons of potato plants, whereas intensive accumulation took place in immature and mature tubers. Selenium increased carbohydrate accumulation in the young upper leaves and, at maturity, in stolons, roots and tubers. This could not be explained by increased production of photoassimilates, as net photosynthesis did not differ among Se treatments. The Se-treated plants produced higher tuber yields than control plants, and at the highest Se concentration (0.3 mg kg⁻¹) lower numbers of larger tubers were harvested. The increased yield of Se-treated plants suggested that Se may enhance the allocation of photoassimilates to tuber growth, the tubers acting as a strong sink for both Se and carbohydrates. As in other plant species, the positive impact of Se on the yield of potato plants could be related to its antioxidative effect in delaying senescence. The highest Se supplementation (0.9 mg kg⁻¹) slightly decreased the glycoalkaloid concentration of immature tubers. However, at this level the Se concentration in tubers was about 20 µg g⁻¹ DW: consumption of 100 g of potato would provide about 500 µg of Se, which exceeds the upper safe intake level of 400 µg per day for humans. The low Se applications (0.0035 and 0.1 mg kg⁻¹) diminished and retarded raw darkening in tubers stored for one and eight months, which can be attributed to the antioxidative properties of Se. Storage for 1 to 12 months did not affect the Se concentrations of tubers. In the Se-enriched tubers Se was allocated to the organic Se fraction, indicating that it was incorporated into organic compounds. The elevated Se concentration in the next-generation tubers produced by Se-enriched seed tubers indicated that Se can be translocated from the seed tubers to the progeny.
In seed tubers stored for eight months, high Se levels had some positive effects on the growth vigour of sprouts, but Se had no consistent effect on the growth vigour of seed tubers of optimal physiological age. These results indicate that Se is a beneficial trace element in potato plants that exerts a positive effect on yield formation and improves the processing and storage quality of table potato tubers. These positive effects of Se are, however, dependent on the Se concentration and on the age of the potato plant and tuber.
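The dietary estimate above can be made explicit; a worked version of the arithmetic, assuming roughly 25 g of dry matter per 100 g of fresh tuber (an assumption consistent with the figures quoted, not a value stated in the abstract):

$$100\ \text{g fresh} \times 0.25\ \tfrac{\text{g DW}}{\text{g fresh}} \times 20\ \tfrac{\mu\text{g Se}}{\text{g DW}} = 500\ \mu\text{g Se},$$

which is why a single 100 g serving would already exceed the 400 µg per day upper safe intake level.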

Relevance: 80.00%

Abstract:

Since 1997 the Finnish Jabal Haroun Project (FJHP) has studied the ruins of the monastery and pilgrimage complex (Gr. oikos) of Aaron, located on a plateau of the Mountain of Prophet Aaron, Jabal an-Nabi Harûn, ca. 5 km to the south-west of the UNESCO World Heritage site of Petra in Jordan. This M.A. thesis studies the state of conservation and the damaging processes affecting the stone structures of the site. The chapel was chosen as an example, as it represents the phasing and building materials of the entire site. The aim of this work is to act as a preliminary study for the planning of long-term conservation at the site. The research is empirical in nature. The condition of the stones in the chapel walls was mapped using the Illustrated Glossary on Stone Deterioration by the ICOMOS International Scientific Committee for Stone; this glossary combines several standards and systems of damage mapping used in the field. Climatic conditions (temperature and RH%) were monitored for one year (9/2005-8/2006) using a HOBO Microstation datalogger, and the measurements were compared with contemporary measurements from the nearest weather station, in Wadi Musa. Salts in the stones were studied by taking samples from the stone surfaces by scraping and with the “Paper Pulp” method, i.e. a poultice of wet cellulose fiber (Arbocel BC1000), and analyzing which main types of salts the samples contained. The climatic conditions on the mountain were expected to change rapidly and to differ clearly from conditions in the neighboring areas. The rapid changes were confirmed, but the values did not differ from those nearby as much as expected: the 12 months monitored had average temperatures and were somewhat drier than average. Earlier research in the area has shown that the geological properties of the stone material influence its deterioration. The damage mapping showed clearly that salts are also a major cause of stone weathering. The salt samples contained several salt combinations, whose behavior in the extremely unstable climatic conditions is difficult to predict. Detailed mapping and regular monitoring, especially of the structures that are to remain exposed, are recommended in this work.
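As an aside on the monitoring methodology, a minimal sketch of how year-long datalogger readings can be reduced to monthly means for comparison with a reference weather station; the file and column names (hobo.csv, wadi_musa.csv, temp_c, rh_pct) are hypothetical, not from the FJHP data:

```python
# Summarize datalogger readings into monthly means and compare them
# with a reference station. File/column names are made up for illustration.
import pandas as pd

logger = pd.read_csv("hobo.csv", parse_dates=["timestamp"])
monthly = (logger.set_index("timestamp")[["temp_c", "rh_pct"]]
                 .resample("MS").mean())

station = pd.read_csv("wadi_musa.csv", parse_dates=["timestamp"])
station_monthly = (station.set_index("timestamp")[["temp_c", "rh_pct"]]
                          .resample("MS").mean())

# Month-by-month difference between mountain-top and valley conditions.
print((monthly - station_monthly).round(1))
```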

Relevance: 80.00%

Abstract:

Premature birth and the associated small body size are known to affect health over the life course. Moreover, compelling evidence suggests that birth size, throughout its whole range of variation, is inversely associated with the risk of cardiovascular disease and type 2 diabetes in later life. To explain these findings, the Developmental Origins of Health and Disease (DOHaD) model has been introduced. Within this framework, restricted physical growth is, to a large extent, considered either a product of harmful environmental influences, such as suboptimal nutrition and alterations in the foetal hormonal milieu, or an adaptive reaction to the environment. Whether inverse associations exist between body size at birth and psychological vulnerability factors for mental disorders is poorly known. Thus, the aim of this thesis was to study, in three large prospective cohorts, whether prenatal and postnatal physical growth, across the whole range of variation, is associated with subsequent temperament/personality traits and psychological symptoms that are considered vulnerability factors for mental disorders. Weight and length at birth in full-term infants showed quadratic associations with the temperamental trait of harm avoidance (Study I): the highest scores were characteristic of the smallest individuals, followed by the heaviest/longest. Linear associations between birth size and psychological outcomes were also found: lower weight and thinness at birth predicted more pronounced trait anxiety in late adulthood (Study II); lower birth weight, placental size, and head circumference at 12 months predicted a more pronounced positive schizotypal trait in women (Study III); and thinness and smaller head circumference at birth were associated with symptoms of attention-deficit hyperactivity disorder (ADHD) in children born at term (Study IV). These associations occurred across the whole variation in birth size and persisted after adjusting for several confounders. With respect to growth after birth, individuals with high trait anxiety scores in late adulthood were lighter and thinner in infancy, gained weight more rapidly between 7 and 11 years of age, but weighed less and were shorter in late adulthood relative to their weight and height at 11 years of age (Study II). These results suggest that a suboptimal prenatal environment, reflected in smaller birth size, may affect a variety of psychological vulnerability factors for mental disorders, such as the temperamental trait of harm avoidance, trait anxiety, schizotypal traits, and symptoms of ADHD. The smaller the birth size, across the whole range of variation, the more pronounced were these psychological vulnerability factors. Moreover, some of these outcomes, such as trait anxiety, were also predicted by patterns of growth after birth. The findings are concordant with the DOHaD model, and emphasise the importance of prenatal factors in the aetiology not only of mental disorders but also of their psychological vulnerability factors.
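The quadratic association reported in Study I corresponds to a regression of the schematic form (an illustrative specification, not the study's exact model):

$$\text{HA}_i = \beta_0 + \beta_1\,\text{BW}_i + \beta_2\,\text{BW}_i^{2} + \varepsilon_i, \qquad \beta_2 > 0,$$

where HA is the harm-avoidance score and BW birth weight; a positive quadratic coefficient places the highest predicted scores at the extremes of birth size, matching the U-shaped pattern described.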

Relevance: 80.00%

Abstract:

Approximately one-third of stroke patients experience depression. Stroke also has a profound effect on the lives of the caregivers of stroke survivors, yet depression in this latter population has received little attention. The objectives of this study were to determine which factors are associated with, and can be used to predict, depression at different points in time after stroke; to compare different depression assessment methods among stroke patients; and to determine the prevalence, course and associated factors of depression among the caregivers of stroke patients. A total of 100 consecutive hospital-admitted patients no older than 70 years of age were followed for 18 months after their first ischaemic stroke. Depression was assessed according to the Diagnostic and Statistical Manual of Mental Disorders (DSM-III-R), the Beck Depression Inventory (BDI), the Hamilton Rating Scale for Depression (HRSD), the Visual Analogue Mood Scale (VAMS), the Clinical Global Impression (CGI) and caregiver ratings. Neurological assessments and a comprehensive neuropsychological test battery were performed. Depression in caregivers was assessed by the BDI. Depressive symptoms had an early onset in most cases. Mild depressive symptoms were often persistent, with little change during the 18-month follow-up, although major depression increased over the same interval. Stroke severity was associated with depression especially from 6 to 12 months post-stroke. At the acute phase, older patients were at higher risk of depression, and a higher proportion of men were depressed at 18 months post-stroke. Of the various depression assessment methods, none stood clearly apart from the others. Their feasibility did not differ greatly, but prevalence rates differed widely according to the different criteria. When compared against the DSM-III-R criteria, sensitivity and specificity were acceptable for the CGI, BDI, and HRSD. The CGI and BDI had better sensitivity than the more specific HRSD. The VAMS did not seem to be a reliable method for assessing depression among stroke patients. The caregivers often rated the patients' depression as more severe than did the patients themselves, and their ratings seemed to be influenced by their own depression. Of the caregivers, 30-33% were depressed. At the acute phase, caregiver depression was associated with the severity of the stroke and the older age of the patient. The best predictor of caregiver depression at later follow-up was caregiver depression at the acute phase. The results suggest that depression should be assessed during the early post-stroke period and that the follow-up of those at risk of poor emotional outcome should extend beyond the first year post-stroke. Further, assessment of the well-being of the caregivers should be included as part of the rehabilitation plan for stroke patients.
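Since the scales were compared against DSM-III-R as the reference standard, it may help to recall how sensitivity and specificity reduce to a 2×2 table; the counts in this sketch are invented for illustration, not the study's data:

```python
# Sensitivity/specificity of a screening scale (e.g., BDI) against a
# reference diagnosis (DSM-III-R). Counts are hypothetical.
def sens_spec(tp, fn, fp, tn):
    sensitivity = tp / (tp + fn)   # share of depressed patients the scale detects
    specificity = tn / (tn + fp)   # share of non-depressed patients it clears
    return sensitivity, specificity

sens, spec = sens_spec(tp=18, fn=4, fp=7, tn=71)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```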

Relevance: 80.00%

Abstract:

The aim of the present study was to determine the relationships between insurance status and the utilization and characteristics of oral health care, and to identify factors related to insured patients' selection of a dental clinic or dentist. The study was based on cross-sectional data obtained through phone interviews. The target population comprised adults in the city of Tehran. Using a two-stage stratified random technique, 3,200 seven-digit numbers resembling real phone numbers were drawn; when calling, 1,669 numbers were unavailable (busy, no answer, fax, line blocked). Of the 1,531 subjects who answered the phone call, 224 were outside the target age (under 18), and 221 refused to respond, leaving 1,086 subjects in the final sample. The interviews were carried out using a structured questionnaire covering the characteristics of dental visits, the respondent's reason for selecting a particular dentist or clinic, and demographic and socio-economic background (gender, age, level of education, income, and insurance status). Data analysis included the Chi-square test, ANOVA, and logistic regression with the corresponding odds ratios (OR). Of the 1,086 respondents, 57% were women, 62% were under age 35, 46% had a medium and 34% a high level of education, 13% were under the poverty line, and 70% had insurance coverage: 64% with public and 6% with commercial insurance. Having insurance coverage was more likely for women (OR=1.5), for those in the oldest age group (OR=2.0), and for those with a high level of education (OR=2.5). Of those with dental insurance, 54% reported having had a dental visit within the past 12 months, more often those with commercial insurance than those with public insurance (65% vs. 53%, p<0.001). A check-up as the reason for the most recent visit was most frequent among those with commercial insurance (28%), compared with those having public insurance (16%) or no insurance (13%) (p<0.001). Having had two or more dental visits within the past 12 months was more common among insured respondents than among the non-insured (31% vs. 22%, p=0.01). The non-insured respondents reported tooth extractions almost twice as frequently as the insured ones (p<0.001). Of the 726 insured subjects, 60% selected fully out-of-pocket-paid services (FOP), and 53% were unaware of their insurance benefits. Good interpersonal aspects (OR=4.6), unawareness of dental insurance benefits (OR=4.6), and good technical aspects (OR=2.3) were associated with greater odds of selecting FOP. The present study revealed that dental insurance was positively related to the demand for oral health care as well as to the utilization of services, though to the latter only to a minor extent. Among insured respondents, despite their opportunity to use fully or highly subsidized oral health care services, a good interpersonal relationship and a high quality of services were the most important factors when selecting a dentist or a clinic. The present findings indicate a clear need to modify the dental insurance systems in Iran to facilitate optimal use of oral health care services and maximize the oral health of the population. Special emphasis in the insurance schemes should be placed on preventive care.
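The reported odds ratios come from logistic regression; as a reminder of the mechanics, a minimal sketch in which the OR for a predictor is the exponential of its fitted coefficient (the data here are simulated, not the survey's):

```python
# Toy logistic regression: OR for a binary predictor = exp(coefficient).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
female = rng.integers(0, 2, 500)                 # hypothetical covariate
p = 0.6 + 0.1 * female                           # insurance slightly more likely for women
insured = (rng.random(500) < p).astype(int)

X = sm.add_constant(female.astype(float))
fit = sm.Logit(insured, X).fit(disp=0)
print(np.exp(fit.params[1]))                     # odds ratio for 'female'
```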

Relevance: 80.00%

Abstract:

Is oral health becoming a part of the global health culture? According to the findings of thesis research at the Institute of Dentistry, University of Helsinki, it seems to be. The thesis is entitled “Preadolescents and Their Mothers as Oral Health-Promoting Actors: Non-biologic Determinants of Oral Health among Turkish and Finnish Preadolescents.” The research was supervised by Prof. Murtomaa and conducted by Dr. A. Basak Cinar as a cross-sectional study of 611 Turkish and 223 Finnish preadolescents in Istanbul and Helsinki, from the fourth, fifth, and sixth grades, aged 10 to 12, based on self-administered and pre-tested health behavior questionnaires for the children and their mothers as well as the children's oral health records. The clinically assessed dental status (DMFT) and self-reported oral health of the Turkish preadolescents were significantly poorer than those of the Finns. A similar association occurred for well-being measures (height and weight, self-esteem), but not for school performance. Turkish preadolescents were more dentally anxious and reported lower mean toothbrushing and dietary self-efficacy than did the Finns. The Turks reported the recommended oral health behaviors (toothbrushing twice daily or more, sweet consumption on 2 days or less per week, decreased between-meal sweet consumption) less frequently than did the Finns. Turkish mothers less frequently reported their dental health as above average, the recommended oral health behaviors, and regular dental visits. Their mean dental anxiety was higher, and their self-efficacy in implementing twice-daily toothbrushing lower, than those of the Finnish mothers. Despite these differences between the Turks and the Finns, the associations found in common for all preadolescents, regardless of cultural differences and different oral health care systems, and assessed for the first time in a holistic framework, were as follows. There seems to be an interrelation between oral health and general well-being (body height-weight measures, school performance, and self-esteem) among preadolescents: • Body height was an explanatory factor for dental health, underlining possible common life-course factors for dental health and general well-being. • Better school performance and high levels of self-esteem and self-efficacy were interrelated, and they contributed to good oral health. • Good school performance was a common predictor of twice-daily toothbrushing. Self-efficacy and maternal modelling have a significant role in the maintenance and improvement of both oral and general health-related behaviors, and self-efficacy-based approaches should be integrated into the promotion of better oral health. • All preadolescents with high levels of self-efficacy were more likely to report twice-daily toothbrushing and less frequent sweet consumption. • All preadolescents were likely to imitate the toothbrushing and sweet-consumption behaviors of their mothers. • High levels of self-efficacy contributed to low dental anxiety, in various patterns, in both groups. In conclusion: • Many health-detrimental behaviors arise in the school-age years and are unlikely to change later. Schools have powerful influences on children's development and well-being. Therefore, oral health promotion in schools should be integrated into general health promotion, school curricula, and other activities.
• Health promotion messages should be reinforced in schools, enabling children and their families to develop lifelong, sustainable, positive health-related skills (self-esteem, self-efficacy) and behaviors. • Placing more emphasis on behavioral sciences, preventive approaches, and community-based education during undergraduate studies should encourage social responsibility and health-promoting roles among dentists. Attempts to increase general well-being and to reduce oral health inequalities among preadolescents will remain unsuccessful if individual factors, as well as maternal and societal influences, are not addressed through psycho-social holistic approaches.

Relevance: 80.00%

Abstract:

Aims: The aims of this study were 1) to identify and describe health economic studies that have used quality-adjusted life years (QALYs) based on actual measurements of patients' health-related quality of life (HRQoL); 2) to test the feasibility of routine collection of HRQoL data as an indicator of the effectiveness of secondary health care; and 3) to establish and compare the cost-utility of three large-volume surgical procedures in a real-world setting in the Helsinki University Central Hospital, a large referral hospital providing secondary and tertiary health care services for a population of approximately 1.4 million. Patients and methods: To identify studies that have used QALYs as an outcome measure, a systematic search of the literature was performed using the Medline, Embase, CINAHL, SCI and Cochrane Library electronic databases. In the initial screening, two reviewers independently read the abstracts of the identified articles; the full-text articles were also evaluated independently by two reviewers, with a third reviewer used in cases where the two could not reach a consensus on inclusion. The feasibility of routinely evaluating the cost-effectiveness of secondary health care was tested by setting up a system for collecting HRQoL data on approximately 4 900 patients before and after operative treatments performed in the hospital. The HRQoL data, used as an indicator of treatment effectiveness, were combined with diagnostic and financial indicators routinely collected in the hospital. To compare the cost-effectiveness of three surgical interventions, 712 patients admitted for routine operative treatment completed the 15D HRQoL questionnaire before and 3-12 months after the operation. QALYs were calculated using the obtained utility data and the expected remaining life years of the patients. Direct hospital costs were obtained from the clinical patient administration database of the hospital, and a cost-utility analysis was performed from the perspective of the provider of secondary health care services. Main results: The systematic review (Study I) showed that although QALYs gained are considered an important measure of the effectiveness of health care, the number of studies in which QALYs are based on actual measurements of patients' HRQoL is still fairly limited. Of the reviewed full-text articles, only 70 reported QALYs based on actual before-after measurements using a valid HRQoL instrument. Collection of simple cost-effectiveness data in secondary health care is feasible and could easily be expanded and performed on a routine basis (Study II). It allows meaningful comparisons between various treatments and provides a means for allocating limited health care resources. The cost per QALY gained was €2 770 for cervical operations and €1 740 for lumbar operations; in cases where surgery was delayed, the cost per QALY doubled (Study III). The cost per QALY varied between subgroups in cataract surgery (Study IV): it was €5 130 for patients who had both eyes operated on and €8 210 for patients with only one eye operated on during the 6-month follow-up. In patients whose first eye had been operated on before the study period, the mean HRQoL deteriorated after surgery, precluding the establishment of a cost per QALY.
In arthroplasty patients (Study V) the mean cost per QALY gained over a one-year period was €6 710 for primary hip replacement, €52 270 for revision hip replacement, and €14 000 for primary knee replacement. Conclusions: Although the importance of cost-utility analyses has been stressed during recent years, there are only a limited number of studies in which the evaluation is based on the patients' own assessment of treatment effectiveness. Most cost-effectiveness and cost-utility analyses are based on modeling that employs expert opinion regarding the outcome of treatment, not on patient-derived assessments. Routine collection of effectiveness information from patients entering treatment in secondary health care turned out to be straightforward and did not, for instance, require additional personnel on the wards in which the study was executed. The mean patient response rate was more than 70%, suggesting that patients were happy to participate and appreciated the fact that the hospital showed an interest in their well-being even after the actual treatment episode had ended. Spinal surgery led to a statistically significant and clinically important improvement in HRQoL. The cost per QALY gained was reasonable, at less than half of that observed, for instance, for hip replacement surgery. However, prolonged waiting for an operation approximately doubled the cost per QALY gained from the surgical intervention. The mean utility gain following routine cataract surgery in a real-world setting was relatively small and confined mostly to patients who had had both eyes operated on. The cost of cataract surgery per QALY gained was higher than previously reported and was associated with a considerable degree of uncertainty. Hip and knee replacement both improve HRQoL; the cost per QALY gained from knee replacement is twofold that of hip replacement. The cost-utility results from the three studied specialties showed that there is great variation in the cost-utility of surgical interventions performed in a real-world setting, even when only common, widely accepted interventions are considered. However, the cost per QALY of all the studied interventions, except revision hip arthroplasty, was well below €50 000, a figure sometimes cited in the literature as a threshold for the cost-effectiveness of an intervention. Based on the present study it may be concluded that routine evaluation of the cost-utility of secondary health care is feasible and produces information essential for a rational and balanced allocation of scarce health care resources.
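The cost-utility arithmetic behind these figures is simple: QALYs gained are the utility change (here measured with the 15D) multiplied by the expected remaining life years, and the ratio of direct cost to QALYs gained gives the cost per QALY. A sketch with invented numbers:

```python
# Cost per QALY from a before/after utility measurement.
# All inputs below are illustrative, not the study's patient data.
def cost_per_qaly(utility_before, utility_after, life_years, cost_eur):
    qalys_gained = (utility_after - utility_before) * life_years
    return cost_eur / qalys_gained

# e.g. a 0.03 utility gain sustained over 25 remaining life years:
print(round(cost_per_qaly(0.84, 0.87, 25, 2000)))  # ~2667 EUR per QALY
```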

Relevance: 80.00%

Abstract:

Infection by Epstein-Barr virus (EBV) occurs in approximately 95% of the world's population. EBV was the first human virus implicated in oncogenesis. Characteristic of primary EBV infection are detectable IgM and IgG antibodies against the viral capsid antigen (VCA). During convalescence the VCA IgM disappears while the VCA IgG persists for life. Reactivations of EBV occur both in immunocompromised and in immunocompetent individuals. In serological diagnosis, measurement of the avidity of VCA IgG separates primary from secondary infections. However, in the serodiagnosis of mononucleosis it is quite common to encounter, paradoxically, VCA IgM together with high-avidity VCA IgG indicating past immunity. We determined the etiology of this phenomenon and found that a large proportion (23%) of patients with primary cytomegalovirus (CMV) infection showed antibody profiles of EBV reactivation. In contrast, primary EBV infection did not appear to induce immunoreactivation of CMV. EBV-associated post-transplant lymphoproliferative disease (PTLD) is a life-threatening complication of allogeneic stem cell or solid organ transplantation. PTLD may present with a diverse spectrum of clinical symptoms and signs. Because of the rapid progression of PTLD, especially after stem cell transplantation, the diagnosis must be obtained quickly: with timely detection, the evolution of this fatal disease may be halted by reduction of immunosuppression. A promising new PTLD treatment (also in Finland) is based on anti-CD20 monoclonal antibodies. Diagnosis of PTLD has been demanding because of immunosuppression, blood transfusions and the latent nature of the virus. In 1999 we set up, to our knowledge the first in Finland for any microbial pathogen, a real-time quantitative PCR (qPCR) assay for the detection of EBV DNA in serum/plasma. In addition, we set up an in situ hybridisation assay for EBV RNA in tissue sections. In collaboration with a group of haematologists at Helsinki University Central Hospital we retrospectively determined the incidence of PTLD among 257 allogeneic stem cell transplantations (SCT) performed during 1994-1999. Post-mortem analysis revealed 18 cases of PTLD. From a subset of PTLD cases (12/18) and a series of corresponding controls (36), consecutive serum samples were studied by the new EBV qPCR. All the PTLD patients were positive for EBV DNA, with progressively rising copy numbers. In most PTLD patients EBV DNA became detectable within 70 days of SCT. Of note, the appearance of EBV DNA preceded the PTLD symptoms (fever, lymphadenopathy, atypical lymphocytes). Among the SCT controls, EBV DNA occurred only sporadically, and the EBV-DNA levels remained relatively low. We concluded that EBV qPCR is a highly sensitive (100%) and specific (96%) new diagnostic approach. We also identified risk factors for the development of PTLD. Together with a liver transplantation group at the Transplantation and Liver Surgery Clinic we wanted to clarify how often and how severely EBV infections occur after liver transplantation. We studied by EBV qPCR 1284 plasma samples obtained from 105 adult liver transplant recipients. EBV DNA was detected in 14 patients (13%) during the first 12 months. The peak viral loads of the 13 asymptomatic patients were relatively low (<6600/ml), and EBV DNA subsided quickly from the circulation. Fatal PTLD was diagnosed in one patient.
Finally, we wanted to determine the number and clinical significance of EBV infections of various types occurring in a large, retrospective, non-selected cohort of allogeneic SCT recipients. We analysed by EBV qPCR 5479 serum samples from 406 SCT recipients obtained during 1988-1999. EBV DNA was seen in 57 (14%) patients, of whom 22 (5%) showed progressively rising and ultimately high levels of EBV DNA (median 54 million/ml). Among the SCT survivors, EBV DNA was transiently detectable in 19 (5%) asymptomatic patients. Thus, low-level EBV-DNA positivity in serum occurs relatively often after SCT and may subside without specific treatment, whereas high copy numbers (>50 000) are diagnostic of life-threatening EBV infection. We furthermore developed a mathematical algorithm for predicting the development of life-threatening EBV infection.
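The abstract's decision rule (progressively rising EBV-DNA with copy numbers above 50 000) can be sketched as a simple check; the "strictly rising on consecutive samples" condition here is an illustrative simplification, not the authors' published algorithm:

```python
# Flag a serum EBV-DNA trajectory as high risk: levels rising on
# consecutive samples AND the latest load above the 50,000 cut-off
# quoted in the abstract. A simplified sketch, not the study's algorithm.
def flag_high_risk(copies_per_ml, threshold=50_000):
    rising = all(b > a for a, b in zip(copies_per_ml, copies_per_ml[1:]))
    return rising and copies_per_ml[-1] > threshold

print(flag_high_risk([800, 12_000, 260_000]))   # True: rising, very high
print(flag_high_risk([900, 400, 0]))            # False: transient positivity
```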

Relevance: 80.00%

Abstract:

Sindbis virus (SINV) (genus Alphavirus, family Togaviridae) is an enveloped virus with a genome of single-stranded, positive-polarity RNA of 11.7 kilobases. SINV is widespread in Eurasia, Africa, and Australia, but clinical infection occurs in only a few geographically restricted areas, mainly in Northern Europe. In Europe, antibodies to SINV were first detected in patients with fever, rash, and arthritis in the early 1980s in Finland, and it became evident that the causative agent of this syndrome, named Pogosta disease, was closely related to SINV. The disease is also found in Sweden (Ockelbo disease) and in Russia (Karelian fever). Since 1974, for unknown reasons, the disease has occurred in Finland as large outbreaks every seven years. This study is to a large degree based on material collected during the 2002 Pogosta disease outbreak in Finland. We first developed SINV IgM and IgG enzyme immunoassays (EIA), based on highly purified SINV, for use in serodiagnostics. The EIAs correlated well with the hemagglutination inhibition (HI) test, and all individuals showed neutralizing antibodies. The sensitivities of the IgM and IgG EIAs were 97.6% and 100%, and the specificities 95.2% and 97.6%, respectively. The E1 and E2 envelope glycoproteins of SINV were shown to be recognized by IgM and IgG in the immunoblot early in infection. We isolated SINV from five patients with acute Pogosta disease: one virus strain was recovered from whole blood and four other strains from skin lesions. The etiology of Pogosta disease was confirmed by these first Finnish SINV strains, which also represent the first human SINV isolates from Europe. Phylogenetic analysis indicated that the Finnish SINV strains clustered with the strains previously isolated from mosquitoes in Sweden and Russia, and seemed to share a common ancestor with South African strains. Northern European SINV strains could be maintained locally in disease-endemic regions, but the phylogenetic analysis also suggests that redistribution of SINV tends to occur in a longitudinal direction, possibly with migratory birds. We searched for SINV antibodies in resident grouse (N=621), whose population crashes have previously coincided with human SINV outbreaks, and in migratory birds (N=836). SINV HI antibodies were found for the first time in birds during their spring migration to Northern Europe, in three individuals: a red-backed shrike, a robin, and a song thrush. Of the grouse, 27.4% were seropositive in 2003, one year after a human outbreak, but only 1.4% were seropositive in 2004. Thus, grouse might contribute to the human epidemiology of SINV. A total of 86 patients with verified SINV infection were recruited to the study in 2002. SINV RNA detection or virus isolation from blood and/or skin lesions was successful in eight patients. IgM antibodies became detectable within the first eight days of illness, and IgG within 11 days. The acute phase of Pogosta disease was characterized by arthritis, itching rash, fatigue, mild fever, headache, and muscle pain. In self-administered questionnaires, half of the patients reported joint symptoms lasting more than 12 months. Physical examination of 49 of these patients three years after infection revealed persistent joint manifestations. Arthritis (swelling and tenderness on physical examination) was diagnosed in 4.1% (2/49) of the patients.
Tenderness on palpation or movement of a joint was found in 14.3% of the patients in the rheumatologic examination, and an additional 10.2% complained of persistent arthralgia at the interview. Thus, 24.5% of the patients had joint manifestations attributable to the infection three years earlier. A positive IgM antibody response persisted in 3 of the 49 patients; both patients with arthritis were in this group. Persistent symptoms of SINV infection might have considerable public health implications in areas with high seroprevalence. The age-standardized seroprevalence of SINV (1999-2003, N=2529) in the human population in Finland was 5.2%. The seroprevalence was high in North Karelia, Kainuu, and Central Ostrobothnia. The incidence was highest in North Karelia. The seroprevalence in men (6.0%) was significantly higher than in women (4.1%); however, the average annualized incidence in non-epidemic years was higher in women than in men, possibly indicating that infected men are more frequently asymptomatic. The seroprevalence increased with age, reaching 15.4% in persons aged 60-69 years. The incidence was highest in persons aged 50-59 years.
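Age standardization, as used for the 5.2% overall figure, is a weighted mean of age-group prevalences with weights taken from a standard population; the prevalences and weights in this sketch are invented (only the 15.4% for ages 60-69 echoes the abstract):

```python
# Age-standardized prevalence: weight each age group's prevalence by its
# share in a standard population. Values below are hypothetical.
def age_standardized(prevalence_by_group, standard_weights):
    total = sum(standard_weights)
    return sum(p * w for p, w in zip(prevalence_by_group, standard_weights)) / total

prev = [0.02, 0.04, 0.07, 0.154]        # rising with age, as reported
weights = [0.30, 0.30, 0.25, 0.15]      # hypothetical standard population shares
print(f"{age_standardized(prev, weights):.3f}")
```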

Relevance: 80.00%

Abstract:

Thirty percent of 70-year-old women have osteoporosis; after the age of 80 its prevalence is up to 70%. Postmenopausal women with osteoporosis seem to be at an increased risk of cardiovascular events and of deterioration of oral health, as shown by attachment loss of teeth, which is proportional to the severity of osteoporosis. Osteoporosis can be treated with many different medications, e.g. estrogen and alendronate. We randomized 90 elderly osteoporotic women (65-80 years of age) to receive hormone therapy (HT; 2 mg E2 + NETA), 10 mg alendronate, or their combination for two years and compared their effects on bone mineral density (BMD) and turnover, on two surrogate markers of the risk of cardiovascular disease, C-reactive protein (CRP) and E-selectin, and on oral health. The effect of HT on health-related quality of life (HRQoL) was studied in a population-based cohort of 1663 postmenopausal women (mean age 68 years; 585 estrogen users and 1078 non-users). BMD was measured with dual-energy X-ray absorptiometry (DXA) at 0, 12 and 24 months. Urinary N-telopeptide (NTX) of type I collagen, a marker of bone resorption, and serum aminoterminal propeptide of human type I procollagen (PINP), a marker of bone formation, were measured every six months of treatment. Serum CRP and E-selectin were measured at 0, 6, and 12 months. Dental and periodontal conditions and gingival crevicular fluid (GCF) matrix metalloproteinase (MMP)-8 levels were studied to evaluate oral health status, and a structured questionnaire was used for mouth symptoms. HRQoL was measured with the 15D questionnaire. Lumbar spine BMD increased similarly in all treatment groups (6.8-8.4% and 9.1-11.2% at 12 and 24 months, respectively). Only HT increased femoral neck BMD at both 12 (4.9%) and 24 months (5.8%); at the latter time point the HT group differed significantly from the other groups. HT reduced the bone marker levels of NTX and PINP significantly less than the other two treatments. Oral HT significantly increased the serum CRP level, by 76.5% at 6 and by 47.1% (NS) at 12 months, and decreased the serum E-selectin level by 24.3% and 30.0%, respectively. Alendronate had no effect on these surrogate markers. Alendronate caused a decrease in the resting salivary flow rate and tended to increase GCF MMP-8 levels; otherwise, there was no effect on the parameters of oral health. HT improved the HRQoL of elderly women significantly on the dimensions of usual activities, vitality and sexual activity, but the overall improvement in HRQoL was neither statistically significant nor clinically important. In conclusion, bisphosphonates might be the first option for starting the treatment of postmenopausal osteoporosis in old age.

Relevance: 80.00%

Abstract:

The purpose of this dissertation was to study the applicability of minced autologous fascia grafts for injection laryngoplasty of unilateral vocal fold paralysis (UVFP). The permanence of augmentation and host-versus-graft tissue reactions were of special interest. The topic belongs to phonosurgery, a subdivision of the ear, nose and throat specialty of medicine. UVFP results from an injury to the recurrent laryngeal or the vagal nerve. The main symptom is a hoarse and weak voice. Surgery is warranted for patients in whom spontaneous reinnervation and a course of voice therapy fail to improve the voice. Injection laryngoplasty is a widespread surgical technique which aims to restore glottic closure by augmenting the atrophied vocal muscle and by turning the paralyzed vocal fold towards the midline. Currently, a great diversity of synthetic, xenogeneic, homologous, and autologous substances is available for injection. An autologous graft is ideal in terms of biocompatibility. Free fascia grafts have been used successfully in head and neck surgery for decades, but fascia had not previously been applied to the vocal fold. The fascia is harvested from the lateral thigh under local anesthesia and minced into a paste with scissors. Injection of the vocal fold is performed in laryngomicroscopy under general anesthesia. Three series of clinical trials of injection laryngoplasty with autologous fascia (ILAF) for patients with UVFP were conducted at the Department of Otorhinolaryngology of the Helsinki University Central Hospital. The follow-up ranged from a few months to ten years. The aim was to document the vocal results and any morbidity related to graft harvesting and vocal fold injection. To address the tissue reactions and the degree of reabsorption of the graft, an animal study with a follow-up ranging from 3 days to 12 months was performed at the National Laboratory Animal Center, University of Kuopio. Harvesting of the graft and injection caused only minor morbidity. Histological analysis of the vocal fold tissue showed that fascia was well tolerated. Although some resorption or compaction of the graft during the first months is evident, the graft volume is well maintained. When injected deep and laterally into the vocalis muscle, the fascia graft allows normal vibration of the vocal fold mucosa during phonation. Improvement of voice quality was seen in all series by multiple objective parameters of voice evaluation. However, the vocal results were poor in cases where the nerve trauma was severe, such as UVFP after chest surgery; ILAF is most suitable for the correction of mild to moderate glottic gaps related to less severe nerve damage. Our results indicate that autologous fascia is a feasible and safe new injection material with good and stable vocal results. It offers a practical solution for surgeons treating this complex condition.

Relevance: 80.00%

Abstract:

Objective: Patients with atopic dermatitis often have a poor long-term response to conventional topical or systemic treatments. Staphylococcal superinfections, skin atrophy due to corticosteroid use, and asthma and allergic rhinitis are common. Only a few, usually short-term, studies have addressed the effects of different treatments on these problems. Tacrolimus ointment is the first topical compound suitable for long-term treatment. The aim of this thesis was to evaluate the effects of long-term topical tacrolimus treatment on cutaneous staphylococcal colonization, collagen synthesis, and the symptoms and signs of asthma and allergic rhinitis. Methods: Patients with moderate-to-severe atopic dermatitis were treated with intermittent 0.1% tacrolimus ointment in prospective, open studies lasting for 6 to 48 months. In Study I, cutaneous staphylococcal colonization was followed for 6 to 12 months. In Study II, skin thickness and collagen synthesis were followed for 12 to 24 months by skin ultrasound and by the procollagen I and III propeptide concentrations of suction blister fluid samples, and compared with a group of corticosteroid-treated atopic dermatitis patients and a group of healthy subjects. Study III was a cross-sectional study of the occurrence of respiratory symptoms, bronchial hyper-responsiveness, and sputum eosinophilia in atopic dermatitis patients and healthy controls. In Study V, the same parameters as in Study III were assessed in atopic dermatitis patients before and after 12 to 48 months of topical tacrolimus treatment. Study IV was a retrospective follow-up of the effect of tacrolimus 0.03% ointment on severe atopic blepharoconjunctivitis and conjunctival cytology. Results: The clinical response to topical tacrolimus was very good in all studies (p≤0.008). Staphylococcal colonization decreased significantly, and the effect was sustained throughout the study (p=0.01). Skin thickness (p<0.001) and markers of collagen synthesis (p<0.001) increased significantly in the tacrolimus-treated patients, whereas they decreased or remained unchanged in the corticosteroid-treated controls. Symptoms of asthma and allergic rhinitis (p<0.0001), bronchial hyper-responsiveness (p<0.0001), and sputum eosinophilia (p<0.0001) were significantly more common in patients with atopic dermatitis than in healthy controls, especially in subjects with positive skin prick tests or elevated serum immunoglobulin E. During topical tacrolimus treatment, asthma and rhinitis symptoms (p=0.005 and p=0.002) and bronchial hyper-responsiveness (p=0.02) decreased significantly, and serum immunoglobulin E and sputum eosinophils showed a decreasing trend in the patients with the best treatment response. Treatment of atopic blepharoconjunctivitis resulted in a marked clinical response and a significant decrease in eosinophils, lymphocytes, and neutrophils in the conjunctival cytology samples. No significant adverse effects or increase in skin infections occurred in any study. Conclusions: The studies included in this thesis, except the study showing an increase in skin collagen synthesis in tacrolimus-treated patients, were uncontrolled, warranting certain reservations. The results suggest, however, that tacrolimus ointment has several beneficial effects in the long-term intermittent treatment of atopic dermatitis. Tacrolimus ointment efficiently suppresses the T cell-induced inflammation of atopic dermatitis.
It has a normalizing effect on the function of the skin, as shown by the decrease in staphylococcal colonization. It does not cause skin atrophy, as corticosteroids do, but instead restores skin collagen synthesis in patients who have used corticosteroids. Tacrolimus ointment has no marked systemic effect, as absorption of the drug is minimal and decreases as the skin improves. The airway effects, the decrease in bronchial hyper-responsiveness and in respiratory symptoms, can be speculated to be caused by decreased T-cell trafficking from the skin to the respiratory tissues as the skin inflammation resolves, as well as by inhibition of the epicutaneous invasion of antigens, which causes systemic sensitization when the skin barrier is disrupted, as in atopic dermatitis. Patients with moderate-to-severe atopic dermatitis seem to benefit from efficient long-term treatment with topical tacrolimus.

Relevance: 80.00%

Abstract:

The project consisted of two long-term follow-up studies of preterm children addressing the question of whether intrauterine growth restriction affects the outcome. Assessment at 5 years of age of 203 children with a birth weight of less than 1000 g, born in Finland in 1996-1997, showed that 9% of the children had cognitive impairment, 14% cerebral palsy, and 4% needed a hearing aid. The intelligence quotient was lower (p<0.05) than the reference value. Thus, 20% exhibited major disabilities, 19% minor disabilities, and 61% had no functional abnormalities. Being small for gestational age (SGA) was associated with sub-optimal growth later. Among children born before 27 gestational weeks, the SGA children had more neuropsychological disabilities than those born appropriate for gestational age (AGA). In another cohort, with birth weights less than 1500 g, assessed at 5 years of age, echocardiography showed a thickened interventricular septum and a decreased left ventricular end-diastolic diameter in both SGA and AGA born children. They also had a higher systolic blood pressure than the reference. Laser-Doppler flowmetry showed endothelium-dependent and -independent vasodilation responses in the AGA children that differed from those of the controls; SGA was not associated with cardiovascular abnormalities. Auditory event-related potentials (AERPs) were recorded using an oddball paradigm with frequency deviants (standard tone 500 Hz, deviant tone 750 Hz with 10% probability). At term, the P350 was smaller in SGA and AGA infants than in controls. At 12 months, the automatic change detection peak (mismatch negativity, MMN) was observed in the controls, whereas the preterm infants had a difference positivity that correlated with their neurodevelopment scores. At 5 years of age, the P1 deflection, which reflects primary auditory processing, was smaller, and the MMN larger, in the preterm than in the control children. Even with a challenging paradigm or a distraction paradigm, P1 was smaller in the preterm than in the control children. The SGA and AGA children showed similar AERP responses. Prematurity is a major risk factor for abnormal brain development. Preterm children showed signs of cardiovascular abnormality, suggesting that prematurity per se may carry a risk of later morbidity. The small positive amplitudes in AERPs suggest persistently altered auditory processing in preterm infants.
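In the oddball paradigm described (500 Hz standards, 750 Hz deviants at 10% probability), the MMN is read from the difference wave: the mean deviant-tone response minus the mean standard-tone response. A schematic sketch with synthetic epochs, not the study's EEG pipeline:

```python
# MMN as a difference wave from synthetic single-channel epochs.
import numpy as np

rng = np.random.default_rng(1)
n_std, n_dev, n_samples = 900, 100, 300    # 10% deviants, as in the paradigm
standard_epochs = rng.normal(0.0, 1.0, (n_std, n_samples))
deviant_epochs = rng.normal(0.0, 1.0, (n_dev, n_samples))
deviant_epochs[:, 120:180] -= 0.8          # injected MMN-like negativity

difference_wave = deviant_epochs.mean(0) - standard_epochs.mean(0)
print(difference_wave[120:180].mean())     # negative deflection ~ the MMN
```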

Relevance: 80.00%

Abstract:

Pediatric renal transplantation (TX) has evolved greatly during the past few decades, and today TX is considered the standard care for children with end-stage renal disease. In Finland, 191 children had received renal transplants by October 2007, and 42% of them have already reached adulthood. Improvements in the treatment of end-stage renal disease, in surgical techniques, in intensive care medicine, and in immunosuppressive therapy have paved the way to the current highly successful outcomes of pediatric transplantation. In children, the transplanted graft should last for decades, and normal growth and development should be guaranteed. These objectives set considerable requirements for optimizing and fine-tuning the post-operative therapy. Careful optimization of immunosuppressive therapy is crucial in protecting the graft against rejection, but also in protecting the patient against the adverse effects of the medication. In the present study, the results of a retrospective investigation into individualized dosing of immunosuppressive medication, based on pharmacokinetic profiles, therapeutic drug monitoring, graft function and histology studies, and glucocorticoid bioactivity determinations, are reported. Subgroups of a total of 178 patients, who received renal transplants in 1988-2006, were included in the study. The mean age at TX was 6.5 years, and approximately 26% of the patients were <2 years of age. The most common diagnosis leading to renal TX was congenital nephrosis of the Finnish type (NPHS1). Pediatric patients in Finland receive standard triple immunosuppression, consisting of cyclosporine A (CsA), methylprednisolone (MP) and azathioprine (AZA), after renal TX. Optimal dosing of these agents is important to prevent rejections and preserve graft function on the one hand, and to avoid the potentially serious adverse effects on the other. CsA has a narrow therapeutic window and individually variable pharmacokinetics; therapeutic monitoring of CsA is therefore mandatory. Traditionally, CsA monitoring has been based on pre-dose trough levels (C0), but recent pharmacokinetic and clinical studies have revealed that the immunosuppressive effect may be related to the diurnal CsA exposure and to the blood CsA concentration 0-4 hours after dosing. The two-hour post-dose concentration (C2) has proved a reliable surrogate marker of CsA exposure. Individual starting doses of CsA were analyzed in 65 patients. A recommended dose, based on a pre-TX pharmacokinetic study, was calculated for each patient according to the pre-TX protocol. The predicted dose was clearly higher in the youngest children than in the older ones (22.9±10.4 and 10.5±5.1 mg/kg/d in patients <2 and >8 years of age, respectively). The actually administered oral doses of CsA were collected for three weeks after TX and compared with the pharmacokinetically predicted dose. After the TX, dosing of CsA was adjusted according to clinical parameters and the blood CsA trough concentration. The pharmacokinetically predicted dose and patient age were the two significant parameters explaining the post-TX doses of CsA. Accordingly, young children received significantly higher oral doses of CsA than the older ones. The correlation with the actually administered doses after TX was best in those patients whose predicted dose was clearly higher or lower (> ±25%) than the average in their age group. Due to the great individual variation in pharmacokinetics, standardized dosing of CsA (based on body mass or surface area) may not be adequate.
Pre-TX profiles are helpful in determining suitable initial CsA doses. CsA monitoring based on trough and C2 concentrations was analyzed in 47 patients who received renal transplants in 2001-2006. C0, C2 and acute rejection episodes were recorded during the post-TX hospitalization, and also three months after TX, when the first protocol core biopsy was obtained. The patients who remained rejection-free had slightly higher C2 concentrations, especially very early after TX. However, after the first two weeks the trough level was also higher in the rejection-free patients than in those with acute rejections. Three months after TX the trough level was higher in patients with normal histology than in those with rejection changes in the routine biopsy. Monitoring of both the trough level and C2 may thus be warranted to guarantee a sufficient peak concentration and baseline immunosuppression on the one hand, and to avoid over-exposure on the other. Controlling rejection in the early months after transplantation is crucial, as it may contribute to the development of long-term allograft nephropathy. Recently, it has become evident that immunoactivation fulfilling the histological criteria of acute rejection is possible in a well-functioning graft with no clinical signs or laboratory perturbations. The influence of treating subclinical rejection, diagnosed in the 3-month protocol biopsy, on graft function and histology 18 months after TX was analyzed in 22 patients and compared with 35 historical control patients. The incidence of subclinical rejection at three months was 43%, and the patients received a standard rejection treatment (a course of increased MP) and/or increased baseline immunosuppression, depending on the severity of rejection and graft function. The glomerular filtration rate (GFR) at 18 months was significantly better in the patients who were screened and treated for subclinical rejection than in the historical patients (86.7±22.5 vs. 67.9±31.9 ml/min/1.73 m², respectively). The improvement was most remarkable in the youngest (<2 years) age group (94.1±11.0 vs. 67.9±26.8 ml/min/1.73 m²). Histological findings of chronic allograft nephropathy were also more common in the historical patients in the 18-month protocol biopsy. All pediatric renal TX patients receive MP as a part of their baseline immunosuppression. Although the maintenance dose of MP is very low in the majority of patients, the well-known steroid-related adverse effects are not uncommon. A previous study in Finnish pediatric TX patients showed that steroid exposure, measured as the area under the concentration-time curve (AUC), rather than the dose, correlates with the adverse effects. In the present study, the MP AUC was measured in sixteen stable maintenance patients, and a correlation was found with excess weight gain during the 12 months after TX, as well as with height deficit. A novel bioassay measuring the activation of the glucocorticoid receptor-dependent transcription cascade was also employed to assess the biological effect of MP. Glucocorticoid bioactivity was found to be related to the adverse effects, although the relationship was not as apparent as that with the serum MP concentration. The findings in this study support individualized monitoring and adjustment of immunosuppression based on pharmacokinetics, graft function and histology. Pharmacokinetic profiles are helpful in estimating drug exposure and thus identifying patients who might be at risk of excessive or insufficient immunosuppression.
Individualized doses and monitoring of blood concentrations should definitely be employed with CsA, and possibly also with steroids. As an alternative to complete steroid withdrawal, individualized dosing based on drug-exposure monitoring might help in avoiding the adverse effects. Early screening and treatment of subclinical immunoactivation is beneficial, as it improves the prospects of good long-term graft function.
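Drug exposure as the area under the concentration-time curve, as used above for MP (and for CsA profiles), is typically computed with the trapezoidal rule from a sparse sampling profile; the times and concentrations in this sketch are illustrative, not study data:

```python
# AUC(0-12h) by the trapezoidal rule from a sparse concentration profile.
import numpy as np

t = np.array([0, 0.5, 1, 2, 4, 6, 8, 12])                 # hours after dose
c = np.array([120, 480, 910, 1050, 620, 380, 250, 140])   # ng/ml, made up

auc_0_12 = np.trapezoid(c, t)   # use np.trapz on NumPy < 2.0
print(round(float(auc_0_12)), "ng*h/ml")
```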

Relevance: 80.00%

Abstract:

Background. Kidney transplantation (KTX) is considered the best treatment for terminal uremia. Despite improvements in short-term graft survival, a considerable number of kidney allografts are lost due to the premature death of patients with a functioning kidney and to chronic allograft nephropathy (CAN). Aim. To investigate the risk factors involved in the progression of CAN and to analyze diagnostic methods for this entity. Materials and methods. Altogether, 153 implant and 364 protocol biopsies obtained between June 1996 and April 2008 were analyzed. The biopsies were classified according to Banff ’97 and the chronic allograft damage index (CADI). Immunohistochemistry for TGF-β1 was performed on 49 biopsies. Kidney function was evaluated by creatinine and/or cystatin C measurement and by various estimates of the glomerular filtration rate (GFR). Demographic data of the donors and recipients were recorded after 2 years' follow-up. Results. Most of the 3-month biopsies (73%) were nearly normal. The mean CADI score in the 6-month biopsies decreased significantly after 2001. Diastolic hypertension correlated with ΔCADI. The serum creatinine concentration at hospital discharge and glomerulosclerosis were risk factors for ΔCADI. High total and LDL cholesterol, low HDL cholesterol and hypertension correlated with chronic histological changes. The mean age of the donors increased from 41 to 52 years; older donors were more often women who had died from an underlying disease. The prevalence of delayed graft function (DGF) increased over the years, while acute rejections (AR) decreased significantly. Subclinical AR was observed in 4% of patients and did not affect long-term allograft function or CADI. The recipients' drug treatment was modified over the course of the studies: mycophenolate mofetil, tacrolimus, statins and blockers of the renin-angiotensin system were more frequently prescribed after 2001. Patients with a higher ΔCADI had a lower GFR during follow-up. A CADI over 2 was best predicted by creatinine, although with modest sensitivity and specificity; neither cystatin C nor the other estimates of GFR were superior to creatinine for CADI prediction. Cyclosporine A toxicity was seldom seen. A low cyclosporine A concentration 2 h after dosing correlated with TGF-β1 expression in interstitial inflammatory cells, and this predicted worse graft function. Conclusions. The progression of CAN was affected by two major factors: the donors' characteristics and the recipients' hypertension. The increased prevalence of DGF might be a consequence of the acceptance of older donors who had died from an underlying disease. Implant biopsies proved to be of prognostic value, and they are essential for comparison with subsequent biopsies. The progression of histological damage was associated with hypertension and dyslipidemia. The significance of the augmented expression of TGF-β1 in inflammatory cells is unclear, but it may be related to low immunosuppression. Serum creatinine is the most suitable tool for monitoring kidney allograft function on an everyday basis. However, protocol biopsies at 6 and 12 months predicted late kidney allograft dysfunction and affected the clinical management of the patients. Protocol biopsies are thus a suitable surrogate for use in clinical trials and for monitoring kidney allografts.