1000 results for 10201106 TM-67


Relevance:

20.00%

Publisher:

Abstract:

The overall objective of this study was to gain epidemiological knowledge about pain among employee populations. More specifically, the aims were to assess the prevalence of pain, to identify socio-economic risk groups and work-related psychosocial risk factors, and to assess the consequences in terms of health-related functioning and sickness absence. The study was carried out among the municipal employees of the City of Helsinki. The data comprised a questionnaire survey conducted in 2000-2002 and register data on sickness absence. Altogether 8960 employees aged 40-60 years participated in the survey (response rate 67%). Pain is common among ageing employees. Approximately 29 per cent of employees reported chronic pain and 15 per cent acute pain, and about seven per cent reported moderately or severely disabling chronic pain. Pain was more common among those with a lower level of education or in a lower occupational class. The psychosocial work environment was associated with pain reports. Job strain, bullying at the workplace, and problems in combining work and home duties were associated with pain among women. Among men, combining work and home duties was not associated with pain, whereas organizational injustice showed associations. Pain affects functional capacity and predicts sickness absence. Those with pain reported a lower level of both mental and physical functioning than those with no pain, physical functioning being more strongly affected than mental functioning. The bodily location of pain, and whether pain was acute or chronic, had only a minor impact on the variation in functioning, whereas a simple count of painful locations was associated with the widest variation. Pain accounted for eight per cent of short-term (1-3 day) sickness absence spells among men and 13 per cent among women.
Of absence spells lasting four to 14 days, pain accounted for 23 per cent among women and 25 per cent among men; the corresponding figures for absence spells of over 14 days were 37 and 30 per cent. The association between pain and sickness absence was relatively independent of physical and psychosocial work factors, especially among women. The results of this study provide a picture of the epidemiology of pain among employees. Pain is a significant problem that seriously affects work ability. Information on risk groups can be utilized to make prevention measures more effective among those at high risk, to decrease pain rates, and thereby to narrow the differences between socio-economic groups. Furthermore, the work-related psychosocial risk factors identified in this study are potentially modifiable, and it should be possible to target interventions at decreasing pain rates among employees.
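The headline figures above reduce to simple arithmetic. The sketch below derives approximate respondent counts from the reported percentages; all counts are computed here for illustration, not taken from the study data.

```python
# Back-of-the-envelope check of the survey figures quoted above.
# All counts below are derived from the reported percentages,
# not from the study data themselves.

def approx_count(prevalence_pct: float, n: int) -> int:
    """Respondents implied by a prevalence percentage in a sample of n."""
    return round(n * prevalence_pct / 100)

n_respondents = 8960                         # participants (response rate 67%)
chronic = approx_count(29, n_respondents)    # ~29% reported chronic pain
acute = approx_count(15, n_respondents)      # ~15% reported acute pain
invited = round(n_respondents / 0.67)        # implied size of the invited group

print(chronic, acute, invited)
```

This implies roughly 2600 employees with chronic pain, about 1340 with acute pain, and an invited population of roughly 13 400.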

Relevance:

20.00%

Publisher:

Abstract:

The influence of chemical mutation featuring the selective conversion of asparagine or glutamine to aspartic or glutamic acid, respectively, on the kinetics of refolding of reduced RNase has been studied. The monodeamidated derivatives of RNase A, viz. RNase Aa1a, Aa1b, and Aa1c, which carry their deamidations in the region 67-74, were found to regain nearly their original enzymatic activity. However, a marked difference in the kinetics of refolding is seen, the order of regain of enzymatic activity being RNase A > Aa1c ≅ Aa1a > Aa1b. The similarities in the distinct elution positions on Amberlite XE-64, gel electrophoretic mobilities, and u.v. spectra of the reoxidized and native derivatives indicated that the native structures are formed. The slower rate of reappearance of enzymatic activity in the case of the monodeamidated derivatives appears to result from altered interactions in the early stages of refolding. The roles of some amino acid residues of the 67-74 region in the pathway of refolding of RNase A are discussed.

Relevance:

20.00%

Publisher:

Abstract:

My work describes two sectors of the human bacterial environment: 1. the sources of exposure to infectious non-tuberculous mycobacteria; 2. bacteria in dust, reflecting the airborne bacterial exposure in environments protecting from or predisposing to allergic disorders. Non-tuberculous mycobacteria (NTM) transmit to humans and animals from the environment. In Finland, infections by NTM have during the past decade become more common than those caused by Mycobacterium tuberculosis. Among farm animals, porcine mycobacteriosis is the predominant NTM disease in Finland; symptoms of mycobacteriosis are found in 0.34% of slaughtered pigs. Soil and drinking water are suspected as sources for humans, and bedding materials for pigs. To obtain quantitative data on the sources of human and porcine NTM exposure, methods for the quantitation of environmental NTM are needed. We developed a quantitative real-time PCR (qPCR) method utilizing primers targeted at the 16S rRNA gene of the genus Mycobacterium. With this method, I found high contents of mycobacterial DNA, 10^6 to 10^7 genome equivalents per gram, in Finnish sphagnum peat, sandy soils, and mud. A similar result was obtained by a method based on Mycobacterium-specific hybridization of 16S rRNA. Since rRNA is found mainly in live cells, this result shows that the DNA detected by qPCR mainly represented live mycobacteria. Next, I investigated the occurrence of environmental mycobacteria in bedding materials obtained from 5 pig farms with a high prevalence (>4%) of mycobacteriosis. When I used the same qPCR methods for quantification as for the soils, I found that the piggery samples contained non-mycobacterial DNA that was amplified in spite of several mismatches with the primers. I therefore improved the qPCR assay by designing Mycobacterium-specific detection probes.
Using the probe qPCR assay, I found 10^5 to 10^7 genome equivalents of mycobacterial DNA in unused bedding materials and up to 1000-fold more in bedding collected after use in the piggery. This result shows that there was a source of mycobacteria in the bedding materials purchased by the piggery and that mycobacteria multiplied in the bedding materials during use in the piggery. Allergic diseases have reached epidemic proportions in urbanized countries. At the same time, a childhood in a rural environment or in simple living conditions appears to protect against allergic disorders; exposure to immunoreactive microbial components in rural environments seems to prevent allergies. I searched for differences in the bacterial communities of two indoor dusts: an urban house dust shown to possess immunoreactivity of the TH2 type and a farm barn dust with TH1 activity. The immunoreactivities of the dusts were demonstrated by my collaborators, in vitro in human dendritic cells and in vivo in mouse. The dusts had accumulated for >10 years in the respiratory zone (>1.5 m above the floor), thus reflecting the long-term content of airborne bacteria at the two sites. I investigated these dusts by cloning and sequencing bacterial 16S rRNA genes from DNA contained in the dust. From the TH2-active urban house dust, I isolated 139 16S rRNA gene clones. The most prevalent genera among the clones were Corynebacterium (5 species, 34 clones), Streptococcus (8 species, 33 clones), Staphylococcus (5 species, 9 clones) and Finegoldia (1 species, 9 clones). Almost all of these species are known as colonizers of the human skin and oral cavity. Species of Corynebacterium and Streptococcus have been reported to contain anti-inflammatory lipoarabinomannans and immunoreactive beta-glucans, respectively. Streptococcus mitis, found in the urban house dust, is known as an inducer of the TH2-polarized immunity characteristic of allergic disorders.
I isolated 152 DNA clones from the TH1-active farm barn dust and found species quite different from those in the urban house dust. Among others, I found DNA clones representing Bacillus licheniformis, Acinetobacter lwoffii and Lactobacillus, each of which was recently reported to possess anti-allergy immunoreactivity. Moreover, the farm barn dust contained dramatically higher bacterial diversity than the urban house dust. Exposure to this dust thus stimulated the human dendritic cells with multiple microbial components; such stimulation has been reported to promote TH1 immunity. The biodiversity in dust may thus be connected to its immunoreactivity. Furthermore, the bacterial biomass in the farm barn dust consisted mainly of live intact bacteria, whereas in the urban house dust only ~1% of the biomass appeared as intact bacteria, as judged by microscopy. Fragmented microbes may possess bioactivity different from that of intact cells, as was recently shown for moulds. If this is also valid for bacteria, the different immunoreactivities of the two dusts may be explained by the intactness of the dustborne bacteria. Based on these results, we propose three factors potentially contributing to the polarized immunoreactivities of the two dusts: (i) the species composition, (ii) the biodiversity, and (iii) the intactness of the dustborne bacterial biomass. The risk of childhood atopic diseases is 4-fold lower in the Russian than in the Finnish Karelia. This difference across the country border is not explainable by different geo-climatic factors or genetic susceptibilities of the two populations; instead, the explanation must be lifestyle-related. It has already been reported that the microbiological quality of drinking water differs on the two sides of the border. In collaboration with allergists, I investigated dusts collected from homes in the Russian Karelia and in the Finnish Karelia.
I found that bacterial 16S rRNA genes cloned from the Russian Karelian dusts (10 homes, 234 clones) predominantly represented Gram-positive taxa (the phyla Actinobacteria and Firmicutes, 67%). The Russian Karelian dusts contained nine-fold more muramic acid (60 to 70 ng/mg) than the Finnish Karelian dusts (3 to 11 ng/mg). Among the DNA clones isolated from the Finnish side (n=231), Gram-negative taxa (40%) outnumbered the Gram-positives (34%). Out of the 465 DNA clones isolated from the Karelian dusts, 242 were assigned to cultured, validly described bacterial species. In the Russian Karelia, animal-associated species, e.g. Staphylococcus and Macrococcus, were numerous (27 clones, 14 unique species). This finding may connect to the difference in the prevalence of allergy, as childhood contacts with pets and farm animals have been connected with a low allergy risk. Plant-associated bacteria and plant-borne 16S rRNA genes (chloroplast) were frequent among the DNA clones isolated from the Finnish Karelia, indicating components originating from plants. In conclusion, my work revealed three major differences between the bacterial communities in the Russian and the Finnish Karelian homes: (i) the high prevalence of Gram-positive bacteria on the Russian side and of Gram-negative bacteria on the Finnish side, (ii) the rich presence of animal-associated bacteria on the Russian side, and (iii) the prevalence of plant-associated bacteria on the Finnish side. One or several of these factors may connect to the differences in the prevalence of allergy.
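The clone-library percentages and fold differences above reduce to simple proportions. The sketch below recomputes two of the quoted figures; the arithmetic is derived here for illustration and adds no data beyond the abstract.

```python
# Illustrative arithmetic on the clone-library figures quoted above.

russian_clones = 234
gram_positive = round(russian_clones * 0.67)   # 67% Gram-positive taxa

# Muramic acid content ranges (ng/mg dust), Russian vs. Finnish Karelia
russian_mid = (60 + 70) / 2
finnish_mid = (3 + 11) / 2
fold_difference = russian_mid / finnish_mid    # midpoint-to-midpoint ratio

print(gram_positive, round(fold_difference, 1))
```

The midpoint ratio comes out at roughly nine, matching the nine-fold difference in muramic acid content reported above.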

Relevance:

20.00%

Publisher:

Abstract:

Phase diagrams for the Tm2O3-H2O-CO2, Yb2O3-H2O-CO2 and Lu2O3-H2O-CO2 systems at 650 and 1300 bars have been investigated in the temperature range of 100–800°C. The phase diagrams are far more complex than those for the lighter lanthanides. The stable phases are Ln(OH)3, Ln2(CO3)3·3H2O (tengerite phase), orthorhombic LnOHCO3, hexagonal Ln2O2CO3, LnOOH and cubic Ln2O3. Ln(OH)3 is stable only at very low partial pressures of CO2. Additional phases stabilised are Ln2O(OH)2CO3 and Ln6(OH)4(CO3)7, which are absent in the lighter lanthanide systems. Other phases, isolated in the presence of minor alkali impurities, are Ln6O2(OH)8(CO3)3, Ln4(OH)6(CO3)3 and Ln12O7(OH)10(CO3)6. The chemical equilibria prevailing in these hydrothermal systems may be best explained on the basis of the four-fold classification of the lanthanides.

Relevance:

20.00%

Publisher:

Abstract:

Options for the integrated management of white blister (caused by Albugo candida) of Brassica crops include well-timed overhead irrigation, resistant cultivars, programs of weekly fungicide sprays, or strategic fungicide applications based on the disease risk prediction model Brassicaspot™. Initial systematic surveys of radish producers near Melbourne, Victoria, indicated that crops irrigated overhead in the morning (0800-1200 h) had a lower incidence of white blister than those irrigated overhead in the evening (2000-2400 h). A field trial was conducted from July to November 2008 on a broccoli crop located west of Melbourne to determine the efficacy and economics of different practices used for white blister control: modifying irrigation timing, growing a resistant cultivar, and timing spray applications based on Brassicaspot™. Growing the resistant cultivar 'Tyson' instead of the susceptible cultivar 'Ironman' reduced disease incidence on broccoli heads by 99%. Overhead irrigation at 0400 h instead of 2000 h reduced disease incidence by 58%. A weekly spray program or a spray regime based on either of two versions of the Brassicaspot™ model provided similar disease control, reducing disease incidence by 72 to 83%. However, use of the Brassicaspot™ models greatly reduced the number of sprays required for control, from 14 to one or two. An economic analysis showed that growing the more resistant cultivar increased farm profit per ha by 12%, choosing morning irrigation increased it by 3%, and using the disease risk predictive models instead of weekly sprays increased it by 15%. The disease risk predictive models were 4% more profitable than the unsprayed control.

Relevance:

20.00%

Publisher:

Abstract:

Microneurovascular free muscle transfer with cross-over nerve grafts in facial reanimation

Loss of facial symmetry and mimetic function, as seen in facial paralysis, has an enormous impact on the psychosocial condition of the patients. Patients with severe long-term facial paralysis are often reanimated with a two-stage procedure combining cross-facial nerve grafting with, 6 to 8 months later, microneurovascular (MNV) muscle transfer. In this thesis, we recorded the long-term results of MNV surgery in facial paralysis and examined the factors possibly contributing to the final functional and aesthetic outcome of this procedure. Twenty-seven of the forty patients operated on were interviewed, and the functional outcome was graded. Magnetic resonance imaging (MRI) of the MNV muscle flaps was performed; nerve graft samples (n=37) were obtained in the second stage of the operation, and muscle biopsies (n=18) were taken during secondary operations. The structure of the MNV muscles and nerve grafts was evaluated using histological and immunohistochemical methods (Ki-67, anti-myosin fast, S-100, NF-200, CD-31, p75NGFR, VEGF, Flt-1, Flk-1). Statistical analysis was performed. In our studies, we found that almost two-thirds of the patients achieved a good result in facial reanimation. The longer the follow-up time after muscle transfer, the weaker the muscle function. A majority of the patients (78%) considered their quality of life improved after surgery. In the MRI study, the free MNV flaps were significantly smaller than their original size. A correlation was found between good functional outcome and normal muscle structure in MRI. In the muscle biopsies, the mean muscle fiber diameter was diminished to 40% of control values. Proliferative activity of satellite cells was seen in 60% of the samples and tended to decline with increasing follow-up time. All samples showed intramuscular innervation. Severe muscle atrophy correlated with prolonged intraoperative ischaemia.
Good long-term functional outcome correlated with a dominance of fast fibers in the muscle grafts. In the nerve grafts, the mean number of viable axons amounted to 38% of that in control samples. The grafted nerves were characterized by fibrosis, and the regenerated axons were thinner than in control samples, although the grafts were well vascularized. A longer time between cross-facial nerve grafting and biopsy sampling correlated with a higher number of viable axons. The p75 nerve growth factor receptor (p75NGFR) was expressed in every nerve graft sample. The expression of p75NGFR was lower in older than in younger patients, and high expression of p75NGFR was often seen with better function of the transplanted muscle. In the grafted nerves, vascular endothelial growth factor (VEGF) and its receptors were expressed in the nervous tissue. In conclusion, most of the patients achieved a good result in facial reanimation and were satisfied with the functional outcome. Mimic function was poorer in patients with a longer follow-up time. MRI can be used to evaluate the structure of microneurovascular muscle flaps. Regeneration of the muscle flaps was still going on many years after transplantation, and reinnervation was seen in all muscle samples. Grafted nerves were characterized by fibrosis and fewer, thinner axons compared with control nerves, although they were well vascularized. p75NGFR and VEGF were expressed in the human nerve grafts with higher intensity than in control nerves, which is described here for the first time.

Relevance:

20.00%

Publisher:

Abstract:

Carotid artery disease is the most prevalent etiologic precursor of ischemic stroke, which is a major health hazard and the second most common cause of death in the world. If a patient presents with a symptomatic high-grade (>70%) stenosis in the internal carotid artery, the treatment of choice is carotid endarterectomy. However, the natural course of radiologically equivalent carotid lesions may be clinically quite diverse, and the reason for that is unknown. It would be of utmost importance to develop molecular markers that predict the symptomatic phenotype of an atherosclerotic carotid plaque (CP) and help to differentiate vulnerable lesions from stable ones. The aim of this study was to investigate the morphologic and molecular factors that associate with stroke-prone CPs. In addition to immunohistochemistry, DNA microarrays were utilized to identify molecular markers that would differentiate between symptomatic and asymptomatic CPs. Endothelial adhesion molecule expression (ICAM-1, VCAM-1, P-selectin, and E-selectin) did not differ between symptomatic and asymptomatic patients. Denudation of endothelial cells was associated with symptom-generating carotid lesions, but in studies on the mechanism of decay of endothelial cells, markers of apoptosis (TUNEL, activated caspase 3) were found to be decreased in the endothelium of symptomatic lesions. Furthermore, markers of endothelial apoptosis were directly associated with those of cell proliferation (Ki-67) in all plaques. FasL expression was significantly increased on the endothelium of symptomatic CPs. DNA microarray analysis revealed prominent induction of specific genes in symptomatic CPs, including those subserving iron and heme metabolism, namely HO-1, and hemoglobin scavenger receptor CD163. HO-1 and CD163 proteins were also increased in symptomatic CPs and associated with intraplaque iron deposits, which, however, did not correlate with symptom status itself. 
ADRP, the gene for adipophilin, was also overexpressed in symptomatic CPs. Adipophilin expression was markedly increased in ulcerated CPs and colocalized with extravasated red blood cells and cholesterol crystals. Taken together, the phenotypic characteristics and the numerous possible molecular mediators of the destabilization of carotid plaques provide potential platforms for future research. The denudation of the endothelial lining observed in symptomatic CPs may lead to direct thromboembolism and maintain harmful oxidative and inflammatory processes, predispose to plaque microhemorrhages, and contribute to lipid accumulation into the plaque, thereby making it vulnerable to rupture.

Relevance:

20.00%

Publisher:

Abstract:

The prevalence of variegate porphyria (VP) (2.1:100 000; in 2006, n=108) was higher in Finland than elsewhere in Europe due to a founder effect (R152C). The incidence of VP was estimated at 0.2:1 000 000 based on the number of new symptomatic patients yearly. The prevalence of porphyria cutanea tarda (PCT) was 1.2:100 000 (in 2006, n=63), which is only one fourth of the figures reported from other European countries. The estimated incidence of PCT was 0.5:1 000 000. Based on measurements of uroporphyrinogen decarboxylase activity in erythrocytes, familial PCT accounted for 49% of the cases. The prevalence of erythropoietic protoporphyria (EPP) was 0.8:100 000 (in 2006, n=39), including asymptomatic carriers of a mutation in the ferrochelatase (FECH) gene. The incidence of EPP was estimated at 0.1:1 000 000. After 1980, the penetrance among patients with VP was 37%. Of the mutation carriers (n=57), 30% manifested with skin symptoms. The frequency of skin symptoms as the only clinical sign was stable before and after 1980 (22% vs. 21%), but acute attacks became infrequent (29% vs. 7%). Of the symptomatic patients, 30% had both acute attacks and skin symptoms, and 80% had skin symptoms. Fragility (95%) and blistering (46%) of the skin on the backs of the hands were the most common skin symptoms. Transient correction of porphyrin metabolism using eight haem arginate infusions within five weeks had no effect on the skin symptoms in three of four patients with VP; in one case the skin symptoms disappeared transiently. One patient with homozygous VP had had severe photosensitivity since birth. Sensory polyneuropathy, glaucoma and renal failure developed during the 25-year follow-up without the presence of acute attacks. The I12T mutation was detected in both of his alleles of the protoporphyrinogen oxidase gene.
The lack of skin symptoms and the infrequency of acute attacks (1/9) in patients with the I12T mutation in the heterozygous state indicate a mild phenotype (penetrance 11%). Four mutations (751delGAGAA, 1122delT, C286T, C343T) in the FECH gene were characterised in four of the 15 families with EPP. Burning pain (96%) and swelling (92%) of the sun-exposed skin were the major skin symptoms. Hepatopathy appeared in one of 25 symptomatic patients (4%). Clinical manifestations and associated factors of PCT were similar in the sporadic and familial types of PCT. The majority of the patients with PCT had one to three precipitating factors: alcohol intake (78%), mutations in the hemochromatosis-associated gene (50%), use of oestrogen (25% of women) and hepatitis B or C infection (25%). Fatty liver disease (67%) and siderosis (67%) were commonly found in their liver biopsies. The major histopathological change in the sun-exposed skin of the patients with VP (n=20), EPP (n=8) and PCT (n=5) was thickening of the vessel walls of the upper dermis, suggesting that the vessel wall is the primary site of the phototoxic reaction in each type of porphyria. The fine structure of the vessel walls was similar in VP, EPP and PCT, consisting of a multilayered basement membrane with an excess of finely granular substance between the layers, surrounded by a band of homogeneous material. EPP was characterised by amorphous perivascular deposits extending also into the extravascular space. In a direct immunofluorescence study, homogeneous IgG deposits in the vessel walls of the upper dermis of the sun-exposed skin were demonstrated in each type of porphyria. In EPP, the excess material around the vessel walls consisted of other proteins, such as serum amyloid protein and kappa and lambda light chains, in addition to basement membrane constituents such as collagen IV and laminin.
These results suggest that the alterations of the vessel walls are a consequence of repeated damage and repair processes in the vessel wall. The microscopic alterations could be demonstrated even in the normal-looking but sun-exposed skin of patients with EPP during the symptom-free phase, suggesting that the vascular changes can be chronic. The stability of the vascular changes in patients with PCT after treatment indicates that circulating porphyrins are not important for the maintenance of the changes.
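Prevalence figures of the form "x:100 000 (n=y)" imply the size of the underlying population. The sketch below performs that derived sanity check; the arithmetic is mine, not the study's.

```python
# Sanity check: case counts divided by prevalence give the implied
# population size. Derived arithmetic, not data from the study.

def implied_population(cases: int, per_100_000: float) -> int:
    return round(cases / (per_100_000 / 100_000))

vp_population = implied_population(108, 2.1)   # variegate porphyria, 2006
pct_population = implied_population(63, 1.2)   # porphyria cutanea tarda, 2006

print(vp_population, pct_population)
```

Both come out close to 5.2 million, consistent with the size of the Finnish population around 2006.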

Relevance:

20.00%

Publisher:

Abstract:

Cervical cancer is the second most common cancer among women globally. Most, probably all, cases arise through a precursor, cervical intraepithelial neoplasia (CIN). Effective cytological screening programmes and surgical treatment of precancerous lesions have dramatically reduced its prevalence and related mortality. Although these treatments are effective, they may have adverse effects on future fertility and pregnancy outcomes. The aim of this study was to evaluate the effects of surgical treatment of the uterine cervix on pregnancy and fertility outcomes, with a particular focus on preterm birth. General preterm birth rates and risk factors during 1987–2005 were studied, as were the long-term mortality rates of the treated women. Information from the Medical Birth Register (MBR), the Hospital Discharge Register (HDR), the Cause-of-Death Register (CDR), and hospital records was used. Treatments were performed during 1987–2003, and subsequent deliveries, IVF treatments and deaths were analyzed. The general preterm birth rate in Finland was relatively stable, varying from 5.1% to 5.4% during the study period (1987 to 2005), although the proportion of extremely preterm births had decreased substantially, by 12%. The main risk factor for preterm birth was multiplicity, followed by elective delivery (induction of delivery or elective cesarean section), primiparity, in vitro fertilization treatment, maternal smoking and advanced maternal age. The risk of preterm birth and low birth weight was increased after any cervical surgical treatment; after conization the risk of preterm birth was almost two-fold (RR 1.99, 95% CI 1.81–2.20). In the conization group the risk was highest for very preterm birth (28–31 gestational weeks), and it was also high for extremely preterm birth (less than 28 weeks). In this group perinatal mortality was also increased. In subgroup analysis, laser ablation was not associated with preterm birth.
When comparing deliveries before and after Loop conization, we found that the risk of preterm birth was increased 1.94-fold (95% CI 1.10–3.40). Adjusting for age, parity, or both did not affect the results. Large or repeat cones increased the risk of preterm birth when compared with smaller cones, suggesting that the size of the removed cone plays a role. This was corroborated by the finding that repeat treatment increased the risk as much as five-fold when compared with the background preterm birth rate. The proportion of IVF deliveries (1.6% vs. 1.5%) was not increased after treatment for CIN when adjusted for year of delivery, maternal age, or parity. Those women who received both treatment for CIN and IVF treatment were older and more often primiparous, which explained their increased risk of preterm birth. We also found that mortality rates were 17% higher among women previously treated for CIN. This excess mortality was seen particularly as increased general disease mortality and increased mortality from alcohol poisoning (by 13%), suicide (by 67%) and injury (by 31%). The risk of cervical cancer was high, as expected (SMR 7.69, 95% CI 4.23–11.15). Women treated for CIN who had a subsequent delivery had a decreased general mortality rate (by 22%) and decreased disease mortality (by 37%). However, those with a preterm birth had increased general mortality (SMR 2.51, 95% CI 1.24–3.78), as a result of cardiovascular diseases, alcohol-related causes, and injuries. In conclusion, the general preterm birth rate has not increased in Finland, unlike in many other developed countries; the rate of extremely preterm births has even decreased. While other risk factors for preterm birth, such as multiplicity and smoking during pregnancy, have decreased, surgical treatments of the uterine cervix have become more important risk factors for preterm birth. Cervical conization is a predisposing factor for preterm birth, low birth weight and even perinatal mortality.
The most frequently used treatment modality, Loop conization, is also associated with an increased risk of preterm birth. Treatments should be tailored individually; low-grade lesions should not be treated at all among young women. The first treatment should be curative, because repeat treatments are especially harmful. The proportion of IVF deliveries was not increased after treatment for CIN, suggesting that current treatment modalities do not strongly impair fertility. The long-term risk of cervical cancer remains high even many years post-treatment; therefore careful surveillance is necessary. In addition, accidental deaths and deaths from injury were common among treated women, suggesting risk-taking behavior among these women. Preterm birth seems to be associated with markedly increased mortality rates, due to cardiovascular, alcohol-related and injury deaths. These women could benefit from health counseling, for example encouragement in quitting smoking.
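A standardized mortality ratio (SMR), such as the 7.69 quoted above for cervical cancer, is simply observed deaths divided by the deaths expected from reference-population rates. The counts in the sketch below are invented for illustration only.

```python
# SMR = observed deaths / expected deaths.
# The observed/expected counts below are hypothetical examples,
# chosen only to reproduce the magnitude of the value reported above.

def smr(observed: int, expected: float) -> float:
    return observed / expected

# e.g. 20 observed deaths where 2.6 would be expected from reference rates
print(round(smr(20, 2.6), 2))  # -> 7.69
```

With 20 observed against 2.6 expected deaths this gives an SMR of 7.69; the counts themselves are not from the study.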

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this study was to estimate the prevalence and distribution of reduced visual acuity, major chronic eye diseases, and the subsequent need for eye care services in the Finnish adult population comprising persons aged 30 years and older. In addition, we analyzed the effect of decreased vision on functioning and the need for assistance, using the World Health Organization’s (WHO) International Classification of Functioning, Disability, and Health (ICF) as a framework. The study was based on the Health 2000 health examination survey, a nationally representative, population-based comprehensive survey of health and functional capacity carried out in 2000 to 2001 in Finland. The study sample, representing the Finnish population aged 30 years and older, was drawn by two-stage stratified cluster sampling. The Health 2000 survey included a home interview and a comprehensive health examination conducted at a nearby screening center; if the invited participants did not attend, an abridged examination was conducted at home or in an institution. Based on our findings, the great majority (96%) of Finnish adults had at least moderate visual acuity (VA ≥ 0.5) with current refraction correction, if any. However, in the age group 75–84 years the prevalence decreased to 81%, and after 85 years to 46%. In the population aged 30 years and older, the prevalence of habitual visual impairment (VA ≤ 0.25) was 1.6%, and 0.5% were blind (VA < 0.1). The prevalence of visual impairment increased significantly with age (p < 0.001), and after the age of 65 years the increase was sharp. Visual impairment was equally common in both sexes (OR 1.20, 95% CI 0.82–1.74). Based on self-reported and/or register-based data, the estimated total prevalences of cataract, glaucoma, age-related maculopathy (ARM), and diabetic retinopathy (DR) in the study population were 10%, 5%, 4%, and 1%, respectively. The prevalence of all of these chronic eye diseases increased with age (p < 0.001).
Cataract and glaucoma were more common in women than in men (OR 1.55, 95% CI 1.26–1.91 and OR 1.57, 95% CI 1.24–1.98, respectively). The most prevalent eye diseases in people with visual impairment (VA ≤ 0.25) were ARM (37%), unoperated cataract (27%), glaucoma (22%), and DR (7%). Over one-half (58%) of visually impaired people had had a vision examination during the past five years, and 79% had received some vision rehabilitation services, mainly in the form of spectacles (70%). Only one-third (31%) had received formal low vision rehabilitation (i.e., fitting of low vision aids, patient education, training for orientation and mobility, training for activities of daily living (ADL), or consultation with a social worker). People with low vision (VA 0.1–0.25) were less likely to have received formal low vision rehabilitation, magnifying glasses, or other low vision aids than blind people (VA < 0.1). Furthermore, low cognitive capacity and living in an institution were associated with limited use of vision rehabilitation services. Of the visually impaired living in the community, 71% reported a need for assistance and 24% had an unmet need for assistance in everyday activities. The prevalence of limitations in ADL, instrumental activities of daily living (IADL), and mobility increased with decreasing VA (p < 0.001). Visually impaired persons (VA ≤ 0.25) were four times more likely to have ADL disabilities than those with good VA (VA ≥ 0.8) after adjustment for sociodemographic and behavioral factors and chronic conditions (OR 4.36, 95% CI 2.44–7.78). Limitations in IADL and in measured mobility were five times as likely (OR 4.82, 95% CI 2.38–9.76 and OR 5.37, 95% CI 2.44–7.78, respectively), and self-reported mobility limitations were three times as likely (OR 3.07, 95% CI 1.67–9.63), as in persons with good VA.
The high prevalence of age-related eye diseases and subsequent visual impairment in the fastest growing segment of the population will result in a substantial increase in the demand for eye care services in the future. Many of the visually impaired, especially older persons with decreased cognitive capacity or living in an institution, have not had a recent vision examination and lack adequate low vision rehabilitation. This highlights the need for regular evaluation of visual function in the elderly and an active dissemination of information about rehabilitation services. Decreased VA is strongly associated with functional limitations, and even a slight decrease in VA was found to be associated with limited functioning. Thus, continuous efforts are needed to identify and treat eye diseases to maintain patients’ quality of life and to alleviate the social and economic burden of serious eye diseases.
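The odds ratios quoted above (e.g., OR 4.36, 95% CI 2.44 – 7.78) come from adjusted logistic regression models; as a minimal illustration of the arithmetic behind such estimates, the sketch below computes an unadjusted odds ratio and its Wald 95% confidence interval from a 2×2 table. All counts are invented for illustration and do not correspond to the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald confidence interval from a 2x2 table.

    a: exposed cases      b: exposed non-cases
    c: unexposed cases    d: unexposed non-cases
    z: normal quantile (1.96 for a 95% interval)
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) from the four cell counts
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 10 impaired with disability, 20 impaired without,
# 5 non-impaired with disability, 40 non-impaired without
print(odds_ratio_ci(10, 20, 5, 40))
```

A published OR that adjusts for covariates (as in the study) would instead be read off the exponentiated coefficient of a fitted logistic model; the interval construction on the log scale is the same idea.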


The outcome of the successfully resuscitated patient is mainly determined by the extent of hypoxic-ischemic cerebral injury, and hypothermia has multiple mechanisms of action in mitigating such injury. The present study was undertaken from 1997 to 2001 in Helsinki as a part of the European multicenter study Hypothermia after cardiac arrest (HACA) to test the neuroprotective effect of therapeutic hypothermia in patients resuscitated from out-of-hospital ventricular fibrillation (VF) cardiac arrest (CA). The aim of this substudy was to examine the neurological and cardiological outcome of these patients, and especially to study and develop methods for prediction of outcome in the hypothermia-treated patients. A total of 275 patients were randomized to the HACA trial in Europe. In Helsinki, 70 patients were enrolled in the study according to the inclusion criteria. Those randomized to hypothermia were actively cooled externally with a cooling device to a core temperature of 33 ± 1°C for 24 hours. Serum markers of ischemic neuronal injury, NSE and S-100B, were sampled at 24, 36, and 48 hours after CA. Somatosensory and brain stem auditory evoked potentials (SEPs and BAEPs) were recorded 24 to 28 hours after CA; 24-hour ambulatory electrocardiography recordings were performed three times during the first two weeks, and arrhythmias and heart rate variability (HRV) were analyzed from the tapes. The clinical outcome was assessed 3 and 6 months after CA. Neuropsychological examinations were performed on the conscious survivors 3 months after the CA. Quantitative electroencephalography (Q-EEG) and auditory P300 event-related potentials were studied at the same time-point. Therapeutic hypothermia of 33°C for 24 hours led to an increased chance of good neurological outcome and survival after out-of-hospital VF CA. In the HACA study, 55% of hypothermia-treated patients and 39% of normothermia-treated patients reached a good neurological outcome (p=0.009) at 6 months after CA.
Use of therapeutic hypothermia was not associated with any increase in clinically significant arrhythmias. The levels of serum NSE, but not the levels of S-100B, were lower in hypothermia- than in normothermia-treated patients. A decrease in NSE values between 24 and 48 hours was associated with good outcome at 6 months after CA. Decreasing levels of serum NSE but not of S-100B over time may indicate selective attenuation of delayed neuronal death by therapeutic hypothermia, and the time-course of serum NSE between 24 and 48 hours after CA may help in clinical decision-making. In SEP recordings, bilaterally absent N20 responses predicted permanent coma with a specificity of 100% in both treatment arms. Recording of BAEPs provided no additional benefit in outcome prediction. Preserved 24- to 48-hour HRV may be a predictor of favorable outcome in CA patients treated with hypothermia. At 3 months after CA, no differences appeared in any cognitive functions between the two groups: 67% of patients in the hypothermia group and 44% of patients in the normothermia group were cognitively intact or had only very mild impairment. No significant differences emerged in any of the Q-EEG parameters between the two groups. The amplitude of the P300 potential was significantly higher in the hypothermia-treated group. These results give further support to the use of therapeutic hypothermia in patients with sudden out-of-hospital CA.
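Specificity figures such as the 100% reported for bilaterally absent N20 responses are simple ratios from a 2×2 prediction table; a minimal sketch of the two quantities (the counts in the usage example are hypothetical, not the study's):

```python
def specificity_sensitivity(tp, fp, tn, fn):
    """Specificity and sensitivity of a binary outcome predictor.

    Specificity = TN / (TN + FP): how often the predictor stays negative
    in patients with a good outcome (100% means no false alarms).
    Sensitivity = TP / (TP + FN): the share of poor outcomes it catches.
    """
    specificity = tn / (tn + fp)
    sensitivity = tp / (tp + fn)
    return specificity, sensitivity

# Hypothetical counts: 10 true positives, 0 false positives,
# 30 true negatives, 15 false negatives
print(specificity_sensitivity(10, 0, 30, 15))
```

A 100% specificity, as here, corresponds to zero false positives: no patient who eventually awoke had bilaterally absent N20 responses.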


Background: Helicobacter pylori infection is usually acquired in early childhood and rarely resolves spontaneously. Eradication therapy is currently recommended for virtually all patients. Because the first and second therapies are prescribed without knowing the antibiotic resistance of the bacteria, it is important to know the primary resistance in the population. Aim: This study evaluates the primary resistance of H. pylori among patients in primary health care throughout Finland, the efficacy of three eradication regimens, the symptomatic response to successful therapy, and the effect of smoking on gastric histology and humoral response in H. pylori-positive patients. Patients and methods: A total of 23 endoscopy referral centres located throughout Finland recruited 342 adult patients with positive rapid urease test results, who were referred to upper gastrointestinal endoscopy from primary health care. Gastric histology, H. pylori resistance and H. pylori serology were evaluated. The patients were randomized to receive a seven-day regimen, comprising 1) lansoprazole 30 mg b.d., amoxicillin 1 g b.d. and metronidazole 400 mg t.d. (LAM), 2) lansoprazole 30 mg b.d., amoxicillin 1 g b.d. and clarithromycin 500 mg b.d. (LAC) or 3) ranitidine bismuth citrate 400 mg b.d., metronidazole 400 mg t.d. and tetracycline 500 mg q.d. (RMT). The eradication results were assessed using the 13C-urea breath test 4 weeks after therapy. The patients completed a symptom questionnaire before and a year after the therapy. Results: Primary resistance of H. pylori to metronidazole was 48% among women and 25% among men. In women, metronidazole resistance correlated with previous use of antibiotics for gynaecologic infections and with alcohol consumption. The resistance rate to clarithromycin was only 2%. Intention-to-treat cure rates of LAM, LAC, and RMT were 78%, 91% and 81%.
While in metronidazole-sensitive cases the cure rates with LAM, LAC and RMT were similar, in metronidazole-resistant cases LAM and RMT were inferior to LAC (53%, 67% and 84%, respectively). Previous antibiotic therapies reduced the efficacy of LAC to the level of RMT. Dyspeptic symptoms in the Gastrointestinal Symptoms Rating Scale (GSRS) were decreased by 30.5%. In logistic regression analysis, duodenal ulcer, gastric antral neutrophilic inflammation and age from 50 to 59 years independently predicted a greater decrease in dyspeptic symptoms. In the gastric body, smokers had milder inflammation and less atrophy, and in the antrum a denser H. pylori load. Smokers also had lower IgG antibody titres against H. pylori and a smaller proportional decrease in antibodies after successful eradication. Smoking tripled the risk of duodenal ulcers. Conclusions: In Finland, H. pylori resistance to clarithromycin is low, but metronidazole resistance among women is high, making metronidazole-based therapies unfavourable. Thus, LAC is the best choice for first-line eradication therapy. The effect of eradication on dyspeptic symptoms was only modest. Smoking slows the progression of atrophy in the gastric body.


Pediatric renal transplantation (TX) has evolved greatly during the past few decades, and today TX is considered the standard care for children with end-stage renal disease. In Finland, 191 children had received renal transplants by October 2007, and 42% of them have already reached adulthood. Improvements in treatment of end-stage renal disease, surgical techniques, intensive care medicine, and immunosuppressive therapy have paved the way to the current highly successful outcomes of pediatric transplantation. In children, the transplanted graft should last for decades, and normal growth and development should be guaranteed. These objectives set considerable requirements on optimizing and fine-tuning the post-operative therapy. Careful optimization of immunosuppressive therapy is crucial in protecting the graft against rejection, but also in protecting the patient against adverse effects of the medication. In the present study, the results of a retrospective investigation into individualized dosing of immunosuppressive medication, based on pharmacokinetic profiles, therapeutic drug monitoring, graft function and histology studies, and glucocorticoid biological activity determinations, are reported. Subgroups of a total of 178 patients, who received renal transplants in 1988–2006, were included in the study. The mean age at TX was 6.5 years, and approximately 26% of the patients were <2 years of age. The most common diagnosis leading to renal TX was congenital nephrosis of the Finnish type (NPHS1). Pediatric patients in Finland receive standard triple immunosuppression consisting of cyclosporine A (CsA), methylprednisolone (MP) and azathioprine (AZA) after renal TX. Optimal dosing of these agents is important to prevent rejections and preserve graft function on the one hand, and to avoid the potentially serious adverse effects on the other. CsA has a narrow therapeutic window and individually variable pharmacokinetics.
Therapeutic monitoring of CsA is, therefore, mandatory. Traditionally, CsA monitoring has been based on pre-dose trough levels (C0), but recent pharmacokinetic and clinical studies have revealed that the immunosuppressive effect may be related to diurnal CsA exposure and blood CsA concentration 0-4 hours after dosing. The two-hour post-dose concentration (C2) has proved a reliable surrogate marker of CsA exposure. Individual starting doses of CsA were analyzed in 65 patients. A recommended dose based on a pre-TX pharmacokinetic study was calculated for each patient by the pre-TX protocol. The predicted dose was clearly higher in the youngest children than in the older ones (22.9±10.4 and 10.5±5.1 mg/kg/d in patients <2 and >8 years of age, respectively). The actually administered oral doses of CsA were collected for three weeks after TX and compared to the pharmacokinetically predicted dose. After the TX, dosing of CsA was adjusted according to clinical parameters and blood CsA trough concentration. The pharmacokinetically predicted dose and patient age were the two significant parameters explaining post-TX doses of CsA. Accordingly, young children received significantly higher oral doses of CsA than the older ones. The correlation to the actually administered doses after TX was best in those patients who had a predicted dose clearly higher or lower (> ±25%) than the average in their age-group. Due to the great individual variation in pharmacokinetics, standardized dosing of CsA (based on body mass or surface area) may not be adequate. Pre-TX profiles are helpful in determining suitable initial CsA doses. CsA monitoring based on trough and C2 concentrations was analyzed in 47 patients, who received renal transplants in 2001–2006. C0, C2 and experienced acute rejections were collected during the post-TX hospitalization, and also three months after TX when the first protocol core biopsy was obtained.
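The comparison above between the pharmacokinetically predicted dose and the dose actually administered turns on a ±25% deviation threshold; a minimal sketch of that check (the function name, inputs, and example doses are illustrative, not part of the study protocol):

```python
def dose_deviation(predicted_mg_kg_d, administered_mg_kg_d):
    """Relative deviation of the administered oral CsA dose from the
    pharmacokinetically predicted dose, flagged when beyond +/-25%."""
    dev = (administered_mg_kg_d - predicted_mg_kg_d) / predicted_mg_kg_d
    return dev, abs(dev) > 0.25

# Illustrative doses in mg/kg/d: a 30% overshoot is flagged,
# a 10% overshoot is not
print(dose_deviation(20.0, 26.0))
print(dose_deviation(10.0, 11.0))
```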
The patients who remained rejection free had slightly higher C2 concentrations, especially very early after TX. However, after the first two weeks also the trough level was higher in the rejection-free patients than in those with acute rejections. Three months after TX the trough level was higher in patients with normal histology than in those with rejection changes in the routine biopsy. Monitoring of both the trough level and C2 may thus be warranted to guarantee sufficient peak concentration and baseline immunosuppression on one hand and to avoid over-exposure on the other. Controlling rejection in the early months after transplantation is crucial, as it may contribute to the development of long-term allograft nephropathy. Recently, it has become evident that immunoactivation fulfilling the histological criteria of acute rejection is possible in a well functioning graft with no clinical signs or laboratory perturbations. The influence of treatment of subclinical rejection, diagnosed in the 3-month protocol biopsy, on graft function and histology 18 months after TX was analyzed in 22 patients and compared to 35 historical control patients. The incidence of subclinical rejection at three months was 43%, and the patients received a standard rejection treatment (a course of increased MP) and/or increased baseline immunosuppression, depending on the severity of rejection and graft function. Glomerular filtration rate (GFR) at 18 months was significantly better in the patients who were screened and treated for subclinical rejection in comparison to the historical patients (86.7±22.5 vs. 67.9±31.9 ml/min/1.73m2, respectively). The improvement was most remarkable in the youngest (<2 years) age group (94.1±11.0 vs. 67.9±26.8 ml/min/1.73m2). Histological findings of chronic allograft nephropathy were also more common in the historical patients in the 18-month protocol biopsy. All pediatric renal TX patients receive MP as a part of the baseline immunosuppression.
Although the maintenance dose of MP is very low in the majority of the patients, the well-known steroid-related adverse effects are not uncommon. It has been shown in a previous study in Finnish pediatric TX patients that steroid exposure, measured as area under the concentration-time curve (AUC), rather than the dose, correlates with the adverse effects. In the present study, MP AUC was measured in sixteen stable maintenance patients, and a correlation with excess weight gain during the 12 months after TX, as well as with height deficit, was found. A novel bioassay measuring the activation of the glucocorticoid receptor dependent transcription cascade was also employed to assess the biological effect of MP. Glucocorticoid bioactivity was found to be related to the adverse effects, although the relationship was not as apparent as that with serum MP concentration. The findings in this study support individualized monitoring and adjustment of immunosuppression based on pharmacokinetics, graft function and histology. Pharmacokinetic profiles are helpful in estimating drug exposure and thus identifying the patients who might be at risk for excessive or insufficient immunosuppression. Individualized doses and monitoring of blood concentrations should definitely be employed with CsA, but possibly also with steroids. As an alternative to complete steroid withdrawal, individualized dosing based on drug exposure monitoring might help in avoiding the adverse effects. Early screening and treatment of subclinical immunoactivation is beneficial, as it improves the prospects of good long-term graft function.
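Steroid exposure above is quantified as the area under the concentration-time curve (AUC). The standard way to compute an AUC from a sampled profile is the linear trapezoidal rule, sketched below; the sampling times and concentrations in the example are purely illustrative.

```python
def auc_trapezoid(times_h, concentrations):
    """Area under the concentration-time curve by the linear
    trapezoidal rule: sum over intervals of
    (t[i] - t[i-1]) * (c[i] + c[i-1]) / 2."""
    auc = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        auc += dt * (concentrations[i] + concentrations[i - 1]) / 2
    return auc

# Illustrative profile: concentration rises to a peak at 1 h, then falls
print(auc_trapezoid([0, 1, 2], [0, 10, 0]))  # units: h * (conc unit)
```

AUC-based exposure monitoring requires several timed samples per dosing interval, which is why surrogate single-point markers (such as C2 for CsA) are attractive in clinical practice.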


Patients with locally advanced (T3-4 M0) or bone-metastasized (T1-4 M1) prostate cancer were randomized to surgical castration (orchiectomy) or to medical castration with intramuscular polyestradiol phosphate (PEP) at a dose of 240 mg/month. The clinical efficacy and the cardiovascular (CV) complications of the treatments were compared. Pretreatment plasma testosterone (T) and estradiol (E2) concentrations were also compared between T3-4 M0 and T1-4 M1 patients, and the effect of the patients' general condition on these hormone levels was examined. Finally, a prognostic three-tier risk classification for prostate cancer death was created for T1-4 M1 patients using pretreatment prognostic factors. No statistically significant difference in clinical efficacy was found between orchiectomy and PEP treatment. As expected, the prognosis of T1-4 M1 patients was worse than that of T3-4 M0 patients. Among T1-4 M1 patients there was no statistical difference in CV deaths between the treatment arms, but non-fatal CV complications were more common in the PEP group (5.9%) than in the orchiectomy group (2.0%). Among T3-4 M0 patients, PEP treatment was associated with a statistically significant risk of CV death compared with orchiectomy (p = 0.001). In the PEP group, 67% of the deaths were acute myocardial infarctions. This PEP-related risk of myocardial infarction (including non-fatal infarctions) was significantly lower in patients whose pretreatment E2 level was at least 93 pmol/l (p = 0.022). The E2 level was significantly lower in T1-4 M1 patients (74.7 pmol/l) than in T3-4 M0 patients (87.9 pmol/l), but there was no corresponding difference in T levels. In both T3-4 M0 and T1-4 M1 patients, declining general condition partly explained the individual decrease in T and E2 levels. A three-tier risk-group classification (Rg) for prostate cancer death was created using alkaline phosphatase (ALP), prostate-specific antigen (PSA), erythrocyte sedimentation rate (ESR), and patient age.
One risk point was assigned for each of ALP > 180 U/l (ALP > 83 U/l with the currently used assay), PSA > 35 µg/l, ESR > 80 mm/h, and age < 60 years, and the points were summed. The following groups were formed: Rg-a (0-1 risk points), Rg-b (2 risk points) and Rg-c (3-4 risk points). The risk of prostate cancer death increased significantly from each risk group to the next (p < 0.001). The Rg classification was clinically practical and good at identifying patients with a poor prognosis.
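The point-sum classification described above maps directly onto a small function. The sketch below uses the thresholds given in the abstract (one point per adverse pretreatment factor; the historical ALP assay threshold of 180 U/l is used, and the current-assay alternative of 83 U/l is omitted for simplicity):

```python
def prostate_risk_group(alp_u_l, psa_ug_l, esr_mm_h, age_years):
    """Three-tier risk-group classification (Rg) for prostate cancer
    death from pretreatment factors, as described in the abstract."""
    points = sum([
        alp_u_l > 180,   # alkaline phosphatase (historical assay)
        psa_ug_l > 35,   # prostate-specific antigen
        esr_mm_h > 80,   # erythrocyte sedimentation rate
        age_years < 60,  # younger age carries a worse prognosis
    ])
    if points <= 1:
        return "Rg-a", points
    if points == 2:
        return "Rg-b", points
    return "Rg-c", points

# Illustrative patients: all four factors adverse vs. none adverse
print(prostate_risk_group(200, 40, 90, 55))
print(prostate_risk_group(100, 10, 50, 70))
```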


Background: Congenital heart defects include a wide range of inborn malformations. Depending on the defect, the life expectancy of a newborn with a cardiac anomaly varies from a few days to a normal life span. In most instances, surgery is the only treatment available. The late results of surgery have not been comprehensively investigated. Aims: Mortality, morbidity and the life situation of all Finnish patients who had been operated on for a congenital heart defect during childhood were investigated. Methods: Patient and surgical data were gathered from all hospitals that had performed heart surgery on children. Late mortality and survival data were obtained from the population registry, and the causes of death from Statistics Finland. Morbidity of patients operated on during 1953-1989 was assessed by the use of medicines. The pharmacotherapy data of patients and controls were obtained from the Social Insurance Institute. The life situation of patients was surveyed by a mailed questionnaire. Survival, causes of death and life situation of patients were compared with those of the general population. Results: A total of 7240 cardiac operations were performed on 6461 children during the first 37 years of cardiac surgery (1953-1989). The number of procedures rose constantly during this period, and the increase continued in later years. The patient material varied over time, as more defects became surgically treatable. During 1953-1989 the operative mortality (death within 30 days of surgery) was 6.9%. In the 1990s a slight rise occurred in early mortality, as increasingly complicated patients were surgically treated. During 2000-2003 practically no defects were beyond the operative range. Thus, the operative mortality of 4.4% was excellent, decreasing even further to 2.0% in 2004-2007. The overall 45-year survival of patients operated on in 1953-1989 was 78%, and the corresponding figure for the general population was 93%.
Survival depended on the defect, being worst among patients with a univentricular heart. Late survival was also better during the 1990s and at the beginning of the 21st century. Of the 6028 early survivors, 592 died late (>30 days) after surgery. A total of 397 deaths (67%) were related and 185 (31%) unrelated to the congenital heart defect. The cause of death was unknown in 10 cases. Of the 5774 patients who survived their first operation and had complete follow-up, 16% were operated on several times. Seventeen percent of patients used medicines for cardiac symptoms (heart failure, arrhythmia, hypertension and coronary disease). Patients' risk of using cardiac medicines was 2.16 (CI 1.97-2.37) times higher than that of controls. Patients also had more genetic syndromes and mental retardation and more often used medicines for asthma and epilepsy. Adult patients who had been operated on as children had coped surprisingly well with their defects. Their level of education was similar to and their employment level even higher than expected, and they were living in a steady relationship as often as the general population. Conclusions: Cardiac surgery developed rapidly, and nowadays practically all defects can be treated. The overall survival of all operated patients was 78%, 16% less than that of the general population. However, it was significantly better than the anticipated natural survival. Nevertheless, many patients had health problems; 16% needed reoperations and 17% cardiac medicines to maintain their condition. Most of the patients assessed their general health as good and lived a normal life.