944 results for "serious"


Abstract:

The aim of the study was to evaluate gastrointestinal (GI) complications after kidney transplantation in the Finnish population. The adult patients included underwent kidney transplantation at Helsinki University Central Hospital in 1990-2000. Data on GI complications were collected from the Finnish Kidney Transplantation Registry, patient records and questionnaires sent to patients. Helicobacter pylori IgG and IgA antibodies were measured in 500 patients before kidney transplantation and after a median 6.8-year follow-up. Oesophagogastroduodenoscopy with biopsies was performed on 46 kidney transplantation patients suffering from gastroduodenal symptoms and on 43 dyspeptic controls for studies of gastroduodenal cytomegalovirus (CMV) infection. Gallbladder ultrasound was performed on 304 patients after a median of 7.4 years post transplantation. Data on serum lipids, body mass index and the use of statin medication were also collected from these 304 patients. Severe GI complications occurred in 147 (10%) of 1515 kidney transplantations, 6% of them fatal, after a median of 0.93 years. Half (51%) of the complications occurred during the first post-transplantation year, with the highest incidence of gastroduodenal ulcers and colonic complications. Patients with GI complications were older and had more delayed graft function, and patients with polycystic kidney disease had more GI complications than the other patients. The H. pylori seropositivity rate was 31%, and seropositivity had no influence on graft or patient survival. Of the H. pylori-seropositive patients, 29% seroreverted without eradication therapy. Of the kidney transplantation patients, 74% had CMV-specific matrix protein pp65 or delayed early protein p52 positive findings in the gastroduodenal mucosa, and 53% of the pp65- or p52-positive patients had gastroduodenal erosions without H. pylori findings. After the transplantation, 165 (11%) patients developed gallstones.
A biliary complication, including one fatal cholecystitis, developed in 15% of the patients with gallstones. Thirteen (0.9%) patients had pancreatitis. Colon perforations, 31% of them fatal, occurred in 16 (1%) patients. Thirteen (0.9%) patients developed a GI malignancy during the follow-up. Two H. pylori-seropositive patients developed gastroduodenal malignancies during the follow-up. In conclusion, severe GI complications usually occur early after kidney transplantation. Colon perforations are especially serious in kidney transplantation patients, and colonic diverticulosis and gallstones should be screened for and treated before transplantation. When found, H. pylori infection should also be treated in these patients.

Abstract:

The rural population of India constitutes about 70% of the total population, and traditional fuels account for 75% of rural energy needs. Depletion of woodlands, coupled with persistent dependency on fuel wood, has posed a serious problem for household energy provision in many parts of the country. This study highlights that traditional fuels still meet 85-95% of fuel needs in rural areas of Kolar district: people prefer fuel wood for cooking and agricultural residues for water heating and other purposes. However, rapid changes in land cover and land use in recent times have affected the availability of these traditional fuels, necessitating inventorying, mapping and monitoring of bioresources for their sustainable management. Remote sensing data (multispectral and panchromatic), a Geographic Information System (GIS), field surveys and non-destructive sampling were used to assess the availability of and demand for energy spatially. Field surveys indicate that rural households depend on species such as Prosopis juliflora, Acacia nilotica and Acacia auriculiformis to meet the fuel wood requirement for domestic activities. Hence, to take stock of fuel wood availability, mapping was done at the species level (with 88% accuracy), considering villages as sampling units, using fused multispectral and panchromatic data.

Abstract:

This study is part of an ongoing collaborative bipolar research project, the Jorvi Bipolar Study (JoBS). The JoBS is run by the Department of Mental Health and Alcohol Research of the National Public Health Institute, Helsinki, and the Department of Psychiatry, Jorvi Hospital, Helsinki University Central Hospital (HUCH), Espoo, Finland. It is a prospective, naturalistic cohort study of secondary-level care psychiatric in- and outpatients with a new episode of bipolar disorder (BD). The second report also included 269 major depressive disorder (MDD) patients from the Vantaa Depression Study (VDS). The VDS was carried out in collaboration with the Department of Psychiatry of the Peijas Medical Care District. Using the Mood Disorder Questionnaire (MDQ), all in- and outpatients at the Department of Psychiatry at Jorvi Hospital who currently had a possible new phase of DSM-IV BD were sought. Altogether, 1630 psychiatric patients were screened, and 490 were interviewed using a semistructured interview (SCID-I/P). The patients included in the cohort (n=191) had a current phase of BD at intake. The patients were evaluated at intake and at 6- and 18-month interviews. Based on this study, BD is poorly recognized even in psychiatric settings. Of the BD patients with acute worsening of illness, 39% had never been correctly diagnosed. The classic presentations of BD with hospitalizations, manic episodes, and psychotic symptoms led clinicians to the correct diagnosis of BD I in psychiatric care. Time elapsed in psychiatric care follow-up, but none of the clinical features, seemed to explain correct diagnosis of BD II, suggesting reliance on cross-sectional presentation of illness. Even though BD II was clearly less often correctly diagnosed than BD I, few other differences between the two types of BD were detected.
BD I and II patients appeared to differ little in terms of clinical picture or comorbidity, and the prevalence of psychiatric comorbidity was strongly related to the current illness phase in both types. At the same time, the difference in outcome was clear. BD II patients spent about 40% more time depressed than BD I patients. Patterns of psychiatric comorbidity of BD and MDD differed somewhat qualitatively. Overall, MDD patients were likely to have more anxiety disorders and cluster A personality disorders, and bipolar patients to have more cluster B personality disorders. The adverse consequences of missing or delayed diagnosis are potentially serious. Thus, these findings strongly support the value of screening for BD in psychiatric settings, especially among the major depressive patients. Nevertheless, the diagnosis must be based on a clinical interview and follow-up of mood. Comorbidity, present in 59% of bipolar patients in a current phase, needs concomitant evaluation, follow-up, and treatment. To improve outcome in BD, treatment of bipolar depression is a major challenge for clinicians.

Abstract:

The purpose of this study was to estimate the prevalence and distribution of reduced visual acuity, major chronic eye diseases, and the subsequent need for eye care services in the Finnish adult population comprising persons aged 30 years and older. In addition, we analyzed the effect of decreased vision on functioning and the need for assistance using the World Health Organization's (WHO) International Classification of Functioning, Disability, and Health (ICF) as a framework. The study was based on the Health 2000 health examination survey, a nationally representative population-based comprehensive survey of health and functional capacity carried out in 2000–2001 in Finland. The study sample representing the Finnish population aged 30 years and older was drawn by two-stage stratified cluster sampling. The Health 2000 survey included a home interview and a comprehensive health examination conducted at a nearby screening center. If the invited participants did not attend, an abridged examination was conducted at home or in an institution. Based on our findings, the great majority (96%) of Finnish adults had at least moderate visual acuity (VA ≥ 0.5) with current refraction correction, if any. However, in the age group 75–84 years the prevalence decreased to 81%, and after 85 years to 46%. In the population aged 30 years and older, the prevalence of habitual visual impairment (VA ≤ 0.25) was 1.6%, and 0.5% were blind (VA < 0.1). The prevalence of visual impairment increased significantly with age (p < 0.001), and after the age of 65 years the increase was sharp. Visual impairment was equally common in both sexes (OR 1.20, 95% CI 0.82–1.74). Based on self-reported and/or register-based data, the estimated total prevalences of cataract, glaucoma, age-related maculopathy (ARM), and diabetic retinopathy (DR) in the study population were 10%, 5%, 4%, and 1%, respectively. The prevalence of all of these chronic eye diseases increased with age (p < 0.001).
Cataract and glaucoma were more common in women than in men (OR 1.55, 95% CI 1.26–1.91 and OR 1.57, 95% CI 1.24–1.98, respectively). The most prevalent eye diseases in people with visual impairment (VA ≤ 0.25) were ARM (37%), unoperated cataract (27%), glaucoma (22%), and DR (7%). More than half (58%) of visually impaired people had had a vision examination during the past five years, and 79% had received some vision rehabilitation services, mainly in the form of spectacles (70%). Only one-third (31%) had received formal low vision rehabilitation (i.e., fitting of low vision aids, patient education, training for orientation and mobility, training for activities of daily living (ADL), or consultation with a social worker). People with low vision (VA 0.1–0.25) were less likely to have received formal low vision rehabilitation, magnifying glasses, or other low vision aids than blind people (VA < 0.1). Furthermore, low cognitive capacity and living in an institution were associated with limited use of vision rehabilitation services. Of the visually impaired living in the community, 71% reported a need for assistance and 24% had an unmet need for assistance in everyday activities. The prevalence of limitations in ADL, instrumental activities of daily living (IADL), and mobility increased with decreasing VA (p < 0.001). Visually impaired persons (VA ≤ 0.25) were four times more likely to have ADL disabilities than those with good VA (VA ≥ 0.8) after adjustment for sociodemographic and behavioral factors and chronic conditions (OR 4.36, 95% CI 2.44–7.78). Limitations in IADL and measured mobility were five times as likely (OR 4.82, 95% CI 2.38–9.76 and OR 5.37, 95% CI 2.44–7.78, respectively), and self-reported mobility limitations three times as likely (OR 3.07, 95% CI 1.67–9.63), as in persons with good VA.
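The odds ratios with 95% confidence intervals reported above can be illustrated with the standard Wald method on a 2×2 table. A minimal sketch with hypothetical counts (the thesis data are not available here; the published ORs were presumably adjusted estimates from logistic regression, which this crude calculation does not reproduce):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a 95% Wald confidence interval from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical example: 20/100 women vs. 10/100 men with a disease.
or_, lo, hi = odds_ratio_ci(20, 80, 10, 90)
```

A CI that straddles 1.0, as for the sex difference in visual impairment above (0.82–1.74), indicates no statistically significant association.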
The high prevalence of age-related eye diseases and subsequent visual impairment in the fastest growing segment of the population will result in a substantial increase in the demand for eye care services in the future. Many of the visually impaired, especially older persons with decreased cognitive capacity or living in an institution, have not had a recent vision examination and lack adequate low vision rehabilitation. This highlights the need for regular evaluation of visual function in the elderly and an active dissemination of information about rehabilitation services. Decreased VA is strongly associated with functional limitations, and even a slight decrease in VA was found to be associated with limited functioning. Thus, continuous efforts are needed to identify and treat eye diseases to maintain patients’ quality of life and to alleviate the social and economic burden of serious eye diseases.

Abstract:

Atrial fibrillation (AF) is the most common arrhythmia requiring treatment. This thesis investigated AF with a specific emphasis on atrial remodeling, which was analysed from epidemiological, clinical and magnetocardiographic (MCG) perspectives. In the first study we evaluated, in real-life clinical practice, a population-based cohort of AF patients referred for their first elective cardioversion (CV). Altogether 183 consecutive patients were included, and sinus rhythm (SR) was restored in 153 (84%) of them. Only 39 (25%) of those maintained SR for one year. Shorter duration of AF and the use of sotalol were the only characteristics associated with better restoration and maintenance of SR. During the one-year follow-up, 40% of the patients ended up in permanent AF. Female gender and older age were associated with the acceptance of permanent AF. The LIFE trial was a prospective, randomised, double-blinded study that evaluated losartan and atenolol in patients with hypertension and left ventricular hypertrophy (LVH). Of the 8,851 patients with SR at baseline and without a history of AF, 371 developed new-onset AF during the study. Patients with new-onset AF had an increased risk of cardiac events and stroke, and an increased rate of hospitalisation for heart failure. Younger age, female gender, lower systolic blood pressure, lesser LVH in ECG, and randomisation to losartan therapy were independently associated with a lower frequency of new-onset AF. The impact of AF on morbidity and mortality was evaluated in a post-hoc analysis of the OPTIMAAL trial, which compared losartan with captopril in patients with acute myocardial infarction (AMI) and evidence of LV dysfunction. Of the 5,477 randomised patients, 655 had AF at baseline, and 345 developed new AF during the follow-up period (median 3.0 years). Older patients and those with signs of more serious heart disease more often had AF at baseline and more often developed it during follow-up.
Patients with AF at baseline had an increased risk of mortality (hazard ratio (HR) 1.32) and stroke (HR 1.77). New-onset AF was associated with increased mortality (HR 1.82) and stroke (HR 2.29). In the fourth study we assessed the reproducibility of our MCG method. This method was used in the fifth study, in which 26 patients with persistent AF had, immediately after the CV, longer P-wave duration and higher energy of the last portion of the atrial signal (RMS40) in MCG, increased P-wave dispersion in SAECG, and decreased pump function of the atria as well as enlarged atrial diameter in echocardiography compared to age- and disease-matched controls. After one month in SR, P-wave duration in MCG still remained longer and left atrial (LA) diameter greater compared to the controls, while the other measurements had returned to the same level as in the control group. In conclusion, AF is not a rare condition in either the general population or patients with hypertension or AMI, and it is associated with an increased risk of morbidity and mortality. Therefore, atrial remodeling, which increases the likelihood of AF and also seems to be relatively stable, has to be identified and prevented. MCG was found to be an encouraging new method for studying electrical atrial remodeling and reverse remodeling. RAAS-suppressing medications appear to be the most promising means of preventing atrial remodeling and AF.

Abstract:

Intensive care is to be provided to patients benefiting from it, in an ethical, efficient, effective and cost-effective manner. This implies long-term qualitative and quantitative analysis of intensive care procedures and related resources. The study population consists of 2709 patients treated in the general intensive care unit (ICU) of Helsinki University Hospital. The study investigates intensive care patients' mortality, quality of life (QOL), quality-adjusted life-years (QALY units) and factors related to severity of illness, length of stay (LOS), patient's age and evaluation period, as well as experiences and memories connected with the ICU episode. In addition, the study examines the qualities of two QOL measures, the RAND 36-Item Health Survey 1.0 (RAND-36) and the five-dimension EuroQol (EQ-5D), and assesses the correlation of their results. Patients treated in 1995 responded to the RAND-36 questionnaire in 1996. All patients treated in 1995-2000 received a QOL questionnaire in 2001, when 1-7 years had elapsed since the intensive treatment. The response rate was 79.5%. Main results: 1) Of the patients who died within the first year (n = 1047), 66% died during the intensive care period or within the following month. The non-survivors were older than the surviving patients, generally had higher than average APACHE II and SOFA scores depicting the severity of illness, and their ICU LOS was longer and hospital stay shorter than those of the surviving patients (p < 0.001). Mortality of patients receiving conservative treatment was higher than that of those receiving surgical treatment. Patients replying to the QOL survey in 2001 (n = 1099) had recovered well: 97% of them lived at home. More than half considered their QOL good or extremely good, 40% satisfactory and 7% bad. All QOL indexes of those of working age were considerably lower (p < 0.001) than the comparable figures for the age- and gender-adjusted Finnish population.
The 5-year monitoring period made evident that mental recovery was slower than physical recovery. 2) The results of the RAND-36 and EQ-5D correlated well (p < 0.01). The RAND-36 profile measure distinguished more clearly between the different categories of QOL and their levels. The EQ-5D measured the patient groups' general QOL well, and its sum index was used to calculate QALY units. 3) QALY units were calculated by multiplying the time the patient survived after the ICU stay, or the expected life-years, by the EQ-5D sum index. Aging automatically lowers the number of QALY units. Patients under the age of 65 receiving conservative treatment benefited from treatment to a greater extent, measured in QALY units, than their peers receiving surgical treatment, but in the age group 65 and over, patients with surgical treatment received higher QALY ratings than recipients of conservative treatment. 4) The intensive care experience and QOL ratings were connected. The QOL indices were statistically highest for those with memories of intensive care as a positive experience, although their illness requiring intensive care treatment was less serious than average. No statistically significant differences were found in the QOL indices of those with negative memories, no memories, or those who did not express the quality of their experiences.
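The QALY computation described above is a simple product of survival time and the EQ-5D utility index. A minimal sketch (the function name and example values are my own, not from the study):

```python
def qalys(years_survived, eq5d_index):
    """QALYs gained: life-years weighted by the EQ-5D sum index,
    where 1.0 represents full health and 0.0 represents death."""
    return years_survived * eq5d_index

# Five years at a utility of 0.8 accrue the same QALYs as
# eight years at a utility of 0.5.
a = qalys(5, 0.8)
b = qalys(8, 0.5)
```

This also shows why aging automatically lowers QALY units: with fewer expected remaining life-years, the product shrinks even at an unchanged utility index.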

Abstract:

Lana Nowakowski's opinion piece on the High Court decision in the Zaburoni HIV case attacks "Queensland's absurd necessity to prove intention on transmission" and argues that "changes to the law are long overdue". Both claims are wrong...

Abstract:

Seismic microzonation has generally been recognized as the most accepted tool in seismic hazard assessment and risk evaluation. In general, risk reduction can be achieved by reducing the hazard, the vulnerability or the value at risk. Since the earthquake hazard cannot be reduced, one has to concentrate on vulnerability and value at risk. The vulnerability of an urban area or municipality depends on the vulnerability of its infrastructure and the redundancies within that infrastructure. The earthquake risk is the damage to buildings, together with the number of people killed or hurt and the economic losses, caused by an earthquake with a return period corresponding to the planning period. The principal approaches one can follow to reduce these losses are to avoid, if possible, high hazard areas for the siting of buildings and infrastructure, and further to ensure that buildings and infrastructure are designed and constructed to resist expected earthquake loads. This can be done if one can assess the hazard at local scales. Seismic microzonation maps provide the basis for scientifically based decision-making to reduce earthquake risk for government and public agencies, private owners and the general public. Further, seismic microzonation carried out on an appropriate scale provides a valuable tool for disaster mitigation planning and emergency response planning for urban centers and municipalities. It provides the basis for identifying the areas of a city or municipality that are most likely to experience serious damage in the event of an earthquake.

Abstract:

Juvenile idiopathic arthritis (JIA) is a heterogeneous group of childhood chronic arthritides, associated with chronic uveitis in 20% of cases. For JIA patients responding inadequately to conventional disease-modifying anti-rheumatic drugs (DMARDs), biologic therapies such as anti-tumor necrosis factor (anti-TNF) agents are available. In this retrospective multicenter study, 258 JIA patients refractory to DMARDs and receiving biologic agents during 1999-2007 were included. Prior to the initiation of anti-TNFs, the growth velocity of 71 patients was delayed in 75% and normal in 25%. Those with delayed growth demonstrated a significant increase in growth velocity after the initiation of anti-TNFs. The increase in growth rate was unrelated to the pubertal growth spurt. No change was observed in skeletal maturation before and after anti-TNFs. The strongest predictor of change in growth velocity was the growth rate prior to anti-TNFs. Change in inflammatory activity remained a significant predictor even after the decrease in glucocorticoids was taken into account. In JIA-associated uveitis, the impact of two first-line biologic agents, etanercept and infliximab, and of a second- or third-line anti-TNF agent, adalimumab, was evaluated. Of 108 refractory JIA patients receiving etanercept or infliximab, uveitis occurred in 45 (42%). Uveitis improved in 14 (31%), no change was observed in 14 (31%), and in 17 (38%) uveitis worsened. Uveitis improved more frequently (p=0.047) and the frequency of annual uveitis flares was lower (p=0.015) in those on infliximab than in those on etanercept. Of 20 patients taking adalimumab, 19 (95%) had previously failed etanercept and/or infliximab. In 7 patients (35%) uveitis improved, in one (5%) it worsened, and in 12 (60%) no change occurred. Those with improved uveitis were younger and had a shorter disease duration. Serious adverse events (AEs) or side-effects were not observed. Adalimumab was effective also in arthritis. Long-term drug survival (i.e. continuation rate on drug) with etanercept (n=105) vs. infliximab (n=104) was 68% vs. 68% at 24 months, and 61% vs. 48% at 48 months (p=0.194 in log-rank analysis). The first-line anti-TNF agent was discontinued either due to inefficacy (etanercept 28% vs. infliximab 20%, p=0.445), AEs (7% vs. 22%, p=0.002), or inactive disease (10% vs. 16%, p=0.068). Females, patients with systemic JIA (sJIA), and those taking infliximab as the first therapy were at higher risk of treatment discontinuation. One-third switched to a second anti-TNF agent, which was discontinued less often than the first. In conclusion, in refractory JIA, anti-TNFs induced enhanced growth velocity. Four-year treatment survival was comparable between etanercept and infliximab, and switching from a first-line to a second-line agent was a reasonable therapeutic option. During anti-TNF treatment, uveitis improved in one-third of patients with JIA-associated anterior uveitis.
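Drug-survival percentages at 24 and 48 months, as compared above with a log-rank test, are conventionally read off Kaplan-Meier curves. A minimal product-limit sketch with invented follow-up data for six hypothetical patients (not the study's data):

```python
def kaplan_meier(times, events):
    """Product-limit estimate of S(t) from paired follow-up times and
    outcome flags: events[i] is 1 if the drug was discontinued at
    times[i], or 0 if follow-up was censored there."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        leaving = [e for tt, e in data if tt == t]
        d = sum(leaving)  # discontinuations at time t
        if d > 0:
            s *= (at_risk - d) / at_risk
            curve.append((t, s))
        at_risk -= len(leaving)
        i += len(leaving)
    return curve

# Months on drug; 1 = discontinued, 0 = still on drug at last follow-up.
curve = kaplan_meier([6, 12, 12, 24, 30, 48], [1, 1, 0, 1, 0, 0])
```

Censored patients (those still on the drug when follow-up ends) stay in the risk set up to their last observation, which is what makes the estimate usable with staggered treatment starts.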

Abstract:

The rapid increase in allergic diseases in developed, high-income countries during recent decades is attributed to several changes in the environment, such as urbanization and improved hygiene. This relative lack of microbial stimulation is connected to a delay in the maturation of the infantile immune system and seems to predispose especially genetically prone infants to allergic diseases. Probiotics, which are live ingestible health-promoting microbes, may compensate for the lack of microbial stimulation of the developing gut immune system and may thus be beneficial in the prevention of allergies. Prebiotics, nutrients indigestible to humans, promote the growth and activity of a number of bacterial strains considered beneficial for the gut. In a large cohort of 1223 infants at hereditary risk for allergies, we studied in a double-blind placebo-controlled manner whether probiotics administered in early life prevent allergic diseases from developing. We also evaluated their safety and their effects on common childhood infections, vaccine antibody responses, and intestinal immune markers. Pregnant mothers used a mixture of four probiotic bacteria or a placebo from their 36th week of gestation. Their infants received the same probiotics plus prebiotic galacto-oligosaccharides for 6 months. The 2-year follow-up consisted of clinical examinations and allergy tests, fecal and blood sampling, and regular questionnaires. Among the 925 infants participating in the 2-year follow-up, the cumulative incidence of any allergic disease (food allergy, eczema, asthma, rhinitis) was comparable in the probiotic (32%) and the placebo (35%) groups. However, eczema, which was the most common manifestation (88%) of all allergic diseases, occurred less frequently in the probiotic (26%) than in the placebo group (32%). The preventive effect was more pronounced against atopic (IgE-associated) eczema, which accounted for 92% of all atopic diseases.
The relative risk reduction of eczema was 26% and of atopic eczema 34%. To prevent one case of eczema, the number of mother-infant pairs needed to treat was 16. Probiotic treatment was safe, with no undesirable outcome for neonatal morbidity, feeding-related behavior, serious adverse events, growth, or vaccine-induced antibody responses. Fewer infants in the probiotic than in the placebo group received antibiotics during their first 6 months of life, and from then until age 2 years they suffered from fewer respiratory tract infections. As a novel finding, we discovered that high fecal immunoglobulin A (IgA) concentrations at age 6 months were associated with a reduced risk of atopic (IgE-associated) diseases by age 2 years. In conclusion, although feeding probiotics to high-risk newborn infants showed no preventive effect on the cumulative incidence of any allergic disease by age 2, they apparently prevented eczema. This probiotic effect was more pronounced among IgE-sensitized infants. The treatment was safe and seemed to stimulate the maturation of the immune system, as indicated by increased resistance to respiratory infections and improved vaccine antibody responses.
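The number needed to treat and relative risk reduction quoted above follow from the two group incidences. A minimal sketch using the rounded 32% vs. 26% eczema incidences; note these rounded figures give an NNT near 16 but a somewhat lower RRR than the published 26%, which was presumably computed from unrounded incidences:

```python
def risk_metrics(p_placebo, p_treated):
    """Absolute risk reduction (ARR), relative risk reduction (RRR),
    and number needed to treat (NNT = 1/ARR) from two incidences."""
    arr = p_placebo - p_treated
    rrr = arr / p_placebo
    return arr, rrr, 1 / arr

# Eczema incidence: 32% on placebo vs. 26% on probiotics.
arr, rrr, nnt = risk_metrics(0.32, 0.26)
```
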

Abstract:

Pediatric renal transplantation (TX) has evolved greatly during the past few decades, and today TX is considered the standard care for children with end-stage renal disease. In Finland, 191 children had received renal transplants by October 2007, and 42% of them have already reached adulthood. Improvements in the treatment of end-stage renal disease, surgical techniques, intensive care medicine, and immunosuppressive therapy have paved the way to the current highly successful outcomes of pediatric transplantation. In children, the transplanted graft should last for decades, and normal growth and development should be guaranteed. These objectives set considerable requirements for optimizing and fine-tuning the post-operative therapy. Careful optimization of immunosuppressive therapy is crucial in protecting the graft against rejection, but also in protecting the patient against the adverse effects of the medication. In the present study, the results of a retrospective investigation into individualized dosing of immunosuppressive medication, based on pharmacokinetic profiles, therapeutic drug monitoring, graft function and histology studies, and glucocorticoid biological activity determinations, are reported. Subgroups of a total of 178 patients, who received renal transplants in 1988-2006, were included in the study. The mean age at TX was 6.5 years, and approximately 26% of the patients were <2 years of age. The most common diagnosis leading to renal TX was congenital nephrosis of the Finnish type (NPHS1). Pediatric patients in Finland receive standard triple immunosuppression consisting of cyclosporine A (CsA), methylprednisolone (MP) and azathioprine (AZA) after renal TX. Optimal dosing of these agents is important to prevent rejections and preserve graft function on the one hand, and to avoid the potentially serious adverse effects on the other. CsA has a narrow therapeutic window and individually variable pharmacokinetics.
Therapeutic monitoring of CsA is, therefore, mandatory. Traditionally, CsA monitoring has been based on pre-dose trough levels (C0), but recent pharmacokinetic and clinical studies have revealed that the immunosuppressive effect may be related to diurnal CsA exposure and to the blood CsA concentration 0-4 hours after dosing. The two-hour post-dose concentration (C2) has proved a reliable surrogate marker of CsA exposure. Individual starting doses of CsA were analyzed in 65 patients. A recommended dose based on a pre-TX pharmacokinetic study was calculated for each patient by the pre-TX protocol. The predicted dose was clearly higher in the youngest children than in the older ones (22.9±10.4 and 10.5±5.1 mg/kg/d in patients <2 and >8 years of age, respectively). The actually administered oral doses of CsA were collected for three weeks after TX and compared to the pharmacokinetically predicted dose. After the TX, dosing of CsA was adjusted according to clinical parameters and the blood CsA trough concentration. The pharmacokinetically predicted dose and patient age were the two significant parameters explaining post-TX doses of CsA. Accordingly, young children received significantly higher oral doses of CsA than the older ones. The correlation to the actually administered doses after TX was best in those patients who had a predicted dose clearly higher or lower (> ±25%) than the average in their age group. Due to the great individual variation in pharmacokinetics, standardized dosing of CsA (based on body mass or surface area) may not be adequate. Pre-TX profiles are helpful in determining suitable initial CsA doses. CsA monitoring based on trough and C2 concentrations was analyzed in 47 patients who received renal transplants in 2001-2006. C0 and C2 concentrations and acute rejection episodes were recorded during the post-TX hospitalization, and also three months after TX, when the first protocol core biopsy was obtained.
The patients who remained rejection-free had slightly higher C2 concentrations, especially very early after TX. However, after the first two weeks the trough level was also higher in the rejection-free patients than in those with acute rejections. Three months after TX, the trough level was higher in patients with normal histology than in those with rejection changes in the routine biopsy. Monitoring of both the trough level and C2 may thus be warranted to guarantee a sufficient peak concentration and baseline immunosuppression on the one hand, and to avoid over-exposure on the other. Controlling rejection in the early months after transplantation is crucial, as it may contribute to the development of long-term allograft nephropathy. Recently, it has become evident that immunoactivation fulfilling the histological criteria of acute rejection is possible in a well-functioning graft with no clinical signs or laboratory perturbations. The influence of treating subclinical rejection, diagnosed in the 3-month protocol biopsy, on graft function and histology 18 months after TX was analyzed in 22 patients and compared to 35 historical control patients. The incidence of subclinical rejection at three months was 43%, and the patients received a standard rejection treatment (a course of increased MP) and/or increased baseline immunosuppression, depending on the severity of rejection and graft function. The glomerular filtration rate (GFR) at 18 months was significantly better in the patients who were screened and treated for subclinical rejection than in the historical patients (86.7±22.5 vs. 67.9±31.9 ml/min/1.73m2, respectively). The improvement was most remarkable in the youngest (<2 years) age group (94.1±11.0 vs. 67.9±26.8 ml/min/1.73m2). Histological findings of chronic allograft nephropathy were also more common in the historical patients in the 18-month protocol biopsy. All pediatric renal TX patients receive MP as a part of the baseline immunosuppression.
Although the maintenance dose of MP is very low in the majority of the patients, the well-known steroid-related adverse effects are not uncommon. A previous study in Finnish pediatric TX patients showed that steroid exposure, measured as the area under the concentration-time curve (AUC), rather than the dose, correlates with the adverse effects. In the present study, MP AUC was measured in sixteen stable maintenance patients, and it correlated with excess weight gain during the 12 months after TX as well as with height deficit. A novel bioassay measuring the activation of the glucocorticoid receptor-dependent transcription cascade was also employed to assess the biological effect of MP. Glucocorticoid bioactivity was found to be related to the adverse effects, although the relationship was not as apparent as that with serum MP concentration. The findings of this study support individualized monitoring and adjustment of immunosuppression based on pharmacokinetics, graft function and histology. Pharmacokinetic profiles are helpful in estimating drug exposure and thus in identifying patients at risk of excessive or insufficient immunosuppression. Individualized doses and monitoring of blood concentrations should definitely be employed with CsA, and possibly also with steroids. As an alternative to complete steroid withdrawal, individualized dosing based on drug-exposure monitoring might help to avoid the adverse effects. Early screening and treatment of subclinical immunoactivation is beneficial, as it improves the prospects of good long-term graft function.
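Drug exposure measured as AUC, as used above for MP, is conventionally computed from a sparse concentration-time profile with the linear trapezoidal rule. A minimal sketch; the sampling times and concentrations below are invented for illustration and do not come from the study:

```python
def auc_trapezoidal(times_h, concentrations):
    """Area under the concentration-time curve by the linear trapezoidal rule."""
    if len(times_h) != len(concentrations) or len(times_h) < 2:
        raise ValueError("need matched time/concentration series of length >= 2")
    auc = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        auc += dt * (concentrations[i] + concentrations[i - 1]) / 2.0
    return auc

# Hypothetical 0-6 h profile (concentrations in ng/ml) after an oral dose.
times = [0.0, 0.5, 1.0, 2.0, 4.0, 6.0]
conc = [0.0, 80.0, 150.0, 120.0, 60.0, 30.0]
print(auc_trapezoidal(times, conc))  # 482.5 (ng*h/ml)
```

Because the full profile is summarized by a single area, two patients on the same dose can differ widely in AUC, which is the rationale for exposure-based rather than dose-based monitoring.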

Relevância: 10.00%

Resumo:

In mammals, regeneration of the central nervous system (CNS) is limited. After CNS injury, inhibitory factors are activated alongside many factors that promote healing. Many molecules, such as laminins, enhance CNS recovery. Laminins are also essential structural components of the body's basement membranes. CNS laminins are important during embryonic development, for example in axon guidance. Later, they participate in the maintenance of the blood-brain barrier and in the tissue reaction that follows injury. In my doctoral research, I investigated the expression of laminins, in particular the γ1 laminin and its KDI peptide, in CNS injury. In an experimental cell culture setting simulating the injured CNS environment, we showed that the KDI peptide enhances both neuronal survival and neurite outgrowth. Kainic acid is a glutamate analogue, and glutamate toxicity is believed to play an important role in the neuronal death that occurs in various CNS injuries and diseases. In the second study of my thesis, we showed in an animal model that the KDI peptide protects neurons of the rat hippocampus from kainic acid-induced cell death. In the third study, we demonstrated with electrophysiological measurements that the KDI peptide inhibits glutamate receptor currents and thereby protects against glutamate toxicity. Brain damage caused by cerebral arterial occlusion is a common cause of stroke. In the final study, we examined laminin expression in ischemia-damaged brain tissue in an animal model. Laminin expression was found to intensify after injury in both basement membrane and extracellular matrix structures. Astrocytes expressing γ1 laminin and the KDI peptide were observed around the lesion at a relatively early stage after injury. This suggests that laminins participate in the pathophysiology of ischemic brain injury.
Overall, my thesis work mapped laminin expression in both the healthy and the injured central nervous system. The work supports the hypothesis that the KDI peptide protects the central nervous system from damage.

Relevância: 10.00%

Resumo:

The prevalence of obesity is increasing at an alarming rate in all age groups worldwide. Obesity is a serious health problem due to the increased risk of morbidity and mortality. Although environmental factors play a major role in the development of obesity, the identification of rare monogenic defects in human genes has confirmed that obesity has a strong genetic component. Mutations have been identified in genes encoding proteins of the leptin-melanocortin signaling system, which has an important role in the regulation of appetite and energy balance. The present study aimed at identifying mutations and genetic variations in the melanocortin receptors 2-5, and in other genes of the same signaling pathway, that account for severe early-onset obesity in children and morbid obesity in adults. The main achievement of this thesis was the identification of melanocortin-4 receptor (MC4R) mutations in Finnish patients. Six pathogenic MC4R mutations (308delT, P299H, two S127L and two -439delGC mutations) were identified, corresponding to a prevalence of 3% in severe early-onset obesity. No obesity-causing MC4R mutations were found among patients with adult-onset morbid obesity. The MC4R 308delT deletion is predicted to result in a grossly truncated, nonfunctional receptor of only 107 amino acids. The C-terminal residues, which are important in MC4R cell surface targeting, are totally absent from the mutant 308delT receptor. In vitro functional studies supported a pathogenic role for the S127L mutation, since agonist-induced signaling of the receptor was impaired. Cell membrane localization of the S127L receptor did not differ from that of the wild-type receptor, confirming that the impaired function of the S127L receptor was due to reduced signaling properties. The P299H mutation leads to intracellular retention of the receptor. The -439delGC deletion is situated at a potential nescient helix-loop-helix 2 (NHLH2) -binding site in the MC4R promoter.
It was demonstrated that the transcription factor NHLH2 binds to the consensus sequence at the -439delGC site in vitro, possibly resulting in altered promoter activity. Several genetic variants were identified in the melanocortin-3 receptor (MC3R) and pro-opiomelanocortin (POMC) genes. These polymorphisms do not explain morbid obesity, but the results indicate that some of these genetic variations may be modifying factors in obesity, resulting in subtle changes in obesity-related traits. A risk haplotype for obesity was identified in the ectonucleotide pyrophosphatase phosphodiesterase 1 (ENPP1) gene through a candidate gene single nucleotide polymorphism (SNP) genotyping approach. An ENPP1 haplotype, composed of SNPs rs1800949 and rs943003, was shown to be significantly associated with morbid obesity in adults. Accordingly, the MC3R, POMC and ENPP1 genes represent examples of susceptibility genes in which genetic variants predispose to obesity. In conclusion, pathogenic mutations in the MC4R gene were shown to account for 3% of cases with severe early-onset obesity in Finland. This is in line with results from other populations demonstrating that mutations in the MC4R gene underlie 1-6% of morbid obesity worldwide. MC4R deficiency thus represents the most common monogenic defect causing human obesity reported so far. The severity of the MC4-receptor defect appears to be associated with time of onset and the degree of obesity. Classification of MC4R mutations may provide a useful tool when predicting the outcome of the disease. In addition, several other genetic variants conferring susceptibility to obesity were detected in the MC3R, MC4R, POMC and ENPP1 genes.
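The candidate-gene SNP genotyping approach mentioned above comes down to comparing allele (or haplotype) frequencies between cases and controls, commonly with a Pearson chi-square test on a 2x2 table. A toy sketch of that test with invented counts, not the study's ENPP1 data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 d.f.) for a 2x2 table
    [[a, b], [c, d]], e.g. risk vs. non-risk allele counts
    in case and control chromosomes."""
    n = a + b + c + d
    den = (a + b) * (c + d) * (a + c) * (b + d)
    if den == 0:
        raise ValueError("degenerate table")
    return n * (a * d - b * c) ** 2 / den

# Invented counts: risk haplotype on 120/400 case chromosomes
# vs. 80/400 control chromosomes.
stat = chi_square_2x2(120, 280, 80, 320)
print(round(stat, 2))  # 10.67; > 3.84 implies p < 0.05 at 1 d.f.
```

Real association studies additionally correct for multiple testing across SNPs and reconstruct haplotypes statistically, both of which are outside this sketch.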

Relevância: 10.00%

Resumo:

Inadvertent failure of power transformers has serious consequences for power-system reliability, economics and revenue. Insulation is the weakest link in a power transformer, prompting periodic inspection of its status at different points in time. Close monitoring of the electrical, chemical and other insulation properties that are sensitive to time-dependent degradation is mandatory for judging the condition of the equipment. The focus here is data-driven Diagnostic Testing and Condition Monitoring (DTCM) specific to power transformers. The authors develop a Monte Carlo approach for augmenting the rather scanty experimental data normally acquired using prototypes of power transformers. A validation procedure for estimating the accuracy of the model so developed is also described.
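The Monte Carlo augmentation idea can be illustrated as follows: fit a simple parametric model to the few prototype measurements, draw synthetic samples from it, and validate by checking that the synthetic data reproduce the measured statistics. Everything below (the normality assumption, the measurement values, the function name) is an illustrative assumption; the paper's actual model for transformer insulation data is not reproduced here:

```python
import random
import statistics

def augment_measurements(samples, n_synthetic, seed=0):
    """Draw synthetic measurements from a normal distribution
    fitted to a small measured sample (illustrative sketch only)."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    rng = random.Random(seed)  # seeded for reproducibility
    return [rng.gauss(mu, sigma) for _ in range(n_synthetic)]

# Hypothetical breakdown-voltage measurements (kV) from a few prototypes.
measured = [42.1, 39.8, 44.0, 41.5, 40.7]
synthetic = augment_measurements(measured, 1000)

# Crude validation: the synthetic mean should track the measured mean.
print(abs(statistics.mean(synthetic) - statistics.mean(measured)) < 1.0)
```

A proper validation, as the abstract implies, would compare the augmented model against held-out experimental data rather than against the same sample it was fitted to.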

Relevância: 10.00%

Resumo:

Objective While driveway run-over incidents continue to cause serious injury and death among young children in Australia, few empirically evaluated educational interventions addressing these incidents have been developed. Addressing this gap, this study describes the development and evaluation of a paper-based driveway safety intervention targeting caregivers of children aged 5 years or younger. Design Cross-sectional survey. Method and setting Informed by previous research, the intervention targeted key caregiver safety behaviours that address driveway risks. To assess the impact of the intervention, 137 Queensland (Australia) caregivers (95.0% women; mean age = 34.97 years) were recruited. After they received the intervention, changes in a number of outcomes, such as caregiver risk perception, safety knowledge and behavioural intentions, were measured. Results Findings indicated that the intervention had increased general and specific situational risk awareness and safety knowledge among a substantial proportion of participants. Close to one-quarter of the sample strongly agreed that the intervention had increased these outcomes. In addition, 71.6% of the sample reported that, as a result of reading the intervention material, they intended to make changes to their routine in and around the driveway, and a further quarter to a third of the participants strongly agreed that the information provided would help both themselves (26.5%) and other caregivers (33.8%) to keep their children safe in the driveway. Conclusion While the educational intervention requires further validation, findings from this study suggest that the intervention's content and format increase driveway safety.