Abstract:
The purpose of this study was to estimate the prevalence and distribution of reduced visual acuity, major chronic eye diseases, and the subsequent need for eye care services in the Finnish adult population comprising persons aged 30 years and older. In addition, we analyzed the effect of decreased vision on functioning and need for assistance using the World Health Organization’s (WHO) International Classification of Functioning, Disability, and Health (ICF) as a framework. The study was based on the Health 2000 health examination survey, a nationally representative population-based comprehensive survey of health and functional capacity carried out in 2000 to 2001 in Finland. The study sample representing the Finnish population aged 30 years and older was drawn by two-stage stratified cluster sampling. The Health 2000 survey included a home interview and a comprehensive health examination conducted at a nearby screening center. If the invited participants did not attend, an abridged examination was conducted at home or in an institution. Based on our findings, the great majority (96%) of Finnish adults had at least moderate visual acuity (VA ≥ 0.5) with current refraction correction, if any. However, in the age group 75–84 years the prevalence decreased to 81%, and after 85 years to 46%. In the population aged 30 years and older, the prevalence of habitual visual impairment (VA ≤ 0.25) was 1.6%, and 0.5% were blind (VA < 0.1). The prevalence of visual impairment increased significantly with age (p < 0.001), and after the age of 65 years the increase was sharp. Visual impairment was equally common in both sexes (OR 1.20, 95% CI 0.82 – 1.74). Based on self-reported and/or register-based data, the estimated total prevalences of cataract, glaucoma, age-related maculopathy (ARM), and diabetic retinopathy (DR) in the study population were 10%, 5%, 4%, and 1%, respectively. The prevalence of all of these chronic eye diseases increased with age (p < 0.001). Cataract and glaucoma were more common in women than in men (OR 1.55, 95% CI 1.26 – 1.91 and OR 1.57, 95% CI 1.24 – 1.98, respectively). The most prevalent eye diseases in people with visual impairment (VA ≤ 0.25) were ARM (37%), unoperated cataract (27%), glaucoma (22%), and DR (7%). More than half (58%) of visually impaired people had had a vision examination during the past five years, and 79% had received some vision rehabilitation services, mainly in the form of spectacles (70%). Only one-third (31%) had received formal low vision rehabilitation (i.e., fitting of low vision aids, receiving patient education, training for orientation and mobility, training for activities of daily living (ADL), or consultation with a social worker). People with low vision (VA 0.1 – 0.25) were less likely to have received formal low vision rehabilitation, magnifying glasses, or other low vision aids than blind people (VA < 0.1). Furthermore, low cognitive capacity and living in an institution were associated with limited use of vision rehabilitation services. Of the visually impaired living in the community, 71% reported a need for assistance and 24% had an unmet need for assistance in everyday activities. The prevalence of limitations in ADL, instrumental activities of daily living (IADL), and mobility increased with decreasing VA (p < 0.001).
Visually impaired persons (VA ≤ 0.25) were four times more likely to have ADL disabilities than those with good VA (VA ≥ 0.8) after adjustment for sociodemographic and behavioral factors and chronic conditions (OR 4.36, 95% CI 2.44 – 7.78). Limitations in IADL and measured mobility were five times as likely (OR 4.82, 95% CI 2.38 – 9.76 and OR 5.37, 95% CI 2.44 – 7.78, respectively) and self-reported mobility limitations were three times as likely (OR 3.07, 95% CI 1.67 – 9.63) as in persons with good VA. The high prevalence of age-related eye diseases and subsequent visual impairment in the fastest growing segment of the population will result in a substantial increase in the demand for eye care services in the future. Many of the visually impaired, especially older persons with decreased cognitive capacity or living in an institution, have not had a recent vision examination and lack adequate low vision rehabilitation. This highlights the need for regular evaluation of visual function in the elderly and an active dissemination of information about rehabilitation services. Decreased VA is strongly associated with functional limitations, and even a slight decrease in VA was found to be associated with limited functioning. Thus, continuous efforts are needed to identify and treat eye diseases to maintain patients’ quality of life and to alleviate the social and economic burden of serious eye diseases.
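The associations above are reported as odds ratios with 95% confidence intervals (e.g., OR 4.36, 95% CI 2.44 – 7.78 for ADL disability). As a minimal illustration of how an unadjusted estimate of this kind is obtained from a 2×2 table (the study's adjusted ORs come from multivariable logistic regression, which is not reproduced here), the following sketch uses hypothetical counts:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table.

    a: exposed with outcome,   b: exposed without outcome
    c: unexposed with outcome, d: unexposed without outcome
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts: ADL disability among visually impaired vs. good-VA participants.
print(odds_ratio_ci(30, 70, 80, 820))
```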
Abstract:
In secondary steelmaking, enhancing the reaction rate during the low-carbon period of decarburization is considered the most effective way to produce ultralow carbon steel. A previous study revealed that the surface reaction is dominant during the final stage of the actual refining process. To improve the surface reaction rate, it is necessary to enlarge the reaction region, which is usually achieved by increasing the plume eye area. In this study, water model experiments were carried out to estimate the influence of bottom stirring conditions on the gas-liquid reaction rate; for this purpose, the deoxidation rate during the bottom bubbling process was measured. Five types of nozzle configurations were used to study the effect of the plume eye area on the reaction rate at various gas flow rates. The results reveal that the surface reaction rate is influenced by the gas flow rate and the plume eye area. An empirical correlation was developed between the reaction rate and the plume eye area. This correlation was applied to estimate the gas-liquid reaction rate at the bath surface.
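The abstract does not state the form of the empirical correlation. A common way to build such a correlation is a power-law fit by linear regression in log space; the sketch below illustrates that generic procedure with made-up data and exponents, not the correlation actually derived in the study.

```python
import numpy as np

# Hypothetical data: plume eye area A_pe [m^2] and apparent deoxidation
# rate constant k [1/s] extracted from bottom-bubbling water-model runs.
A_pe = np.array([0.010, 0.022, 0.035, 0.050, 0.068])
k    = np.array([0.8e-3, 1.9e-3, 3.1e-3, 4.6e-3, 6.4e-3])

# Fit a power law k = a * A_pe**b via linear regression in log space.
b, log_a = np.polyfit(np.log(A_pe), np.log(k), 1)
print(f"k ≈ {np.exp(log_a):.3g} * A_pe^{b:.2f}")
```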
Abstract:
In this paper, an achievable rate region is derived for the three-user discrete memoryless interference channel with asymmetric transmitter cooperation. The three-user channel facilitates different ways of sharing messages between the transmitters. We introduce a form of noncausal (genie-aided) unidirectional message sharing, which we term cumulative message sharing. We consider receivers with predetermined decoding capabilities and define a cognitive interference channel. We then derive an achievable rate region for this channel by employing a coding scheme that combines superposition and Gel'fand-Pinsker coding techniques.
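For context, the binning component of such schemes builds on the classical single-user Gel'fand-Pinsker result for a channel whose state is known non-causally at the encoder; its capacity expression (background material, not a result of this paper) is:

```latex
% Gel'fand-Pinsker capacity: channel p(y|x,s) with state S known
% non-causally at the encoder, U an auxiliary random variable.
C = \max_{p(u \mid s),\; x = f(u,s)} \bigl[\, I(U;Y) - I(U;S) \,\bigr]
```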
Abstract:
Experiments are carried out with air as the test gas to obtain the surface convective heating rate on a missile-shaped body flying at hypersonic speeds. The effect of fins on the surface heating rates of the missile frustum is also investigated. The tests are performed in a hypersonic shock tunnel at a stagnation enthalpy of 2 MJ/kg and zero-degree angle of attack. The experiments are conducted at flow Mach numbers of 5.75 and 8 with an effective test time of 1 ms. The measured stagnation-point heat-transfer data compare well with the theoretical value estimated using the Fay-Riddell expression. The measured heat-transfer rate with the fin configuration is slightly higher than that of the model without fins. The normalized values of the experimentally measured heat-transfer rate and Stanton number compare well with the numerically estimated results.
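Heat-transfer data of this kind are commonly normalized as a Stanton number based on free-stream conditions; the sketch below shows that standard normalization with hypothetical values (the numbers are illustrative, not the tunnel conditions or data of the study).

```python
# Illustrative nondimensionalization of a measured surface heat flux to a
# Stanton number, as is standard in shock-tunnel work; the free-stream
# values below are assumptions for demonstration only.

def stanton_number(q_wall, rho_inf, u_inf, h0_inf, h_wall):
    """St = q_w / (rho_inf * u_inf * (h0_inf - h_wall))."""
    return q_wall / (rho_inf * u_inf * (h0_inf - h_wall))

q_wall  = 2.5e5    # measured heat flux [W/m^2]
rho_inf = 5.0e-3   # free-stream density [kg/m^3]
u_inf   = 1800.0   # free-stream velocity [m/s]
h0_inf  = 2.0e6    # stagnation enthalpy [J/kg] (2 MJ/kg, as in the abstract)
h_wall  = 3.0e5    # wall enthalpy at roughly room temperature [J/kg]

print(f"St = {stanton_number(q_wall, rho_inf, u_inf, h0_inf, h_wall):.4f}")
```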
Abstract:
Acute renal failure (ARF) is a clinical syndrome characterized by a rapidly decreasing glomerular filtration rate, which results in disturbances of electrolyte and acid-base homeostasis, derangement of extracellular fluid volume, and retention of nitrogenous waste products, and is often associated with decreased urine output. ARF affects about 5-25% of patients admitted to intensive care units (ICUs), and is linked to high mortality and morbidity rates. In this thesis, the outcome of critically ill patients with ARF and factors related to outcome were evaluated. A total of 1662 patients from two ICUs and one acute dialysis unit in Helsinki University Hospital were included. In Study I the prevalence of ARF was calculated and classified according to two ARF-specific scoring methods, the RIFLE classification and the classification created by Bellomo et al. (2001). Study II evaluated monocyte human histocompatibility leukocyte antigen-DR (HLA-DR) expression and plasma levels of one proinflammatory (interleukin (IL) 6) and two anti-inflammatory (IL-8 and IL-10) cytokines in predicting survival of critically ill ARF patients. Study III investigated serum cystatin C as a marker of renal function in ARF and its power in predicting survival of critically ill ARF patients. Study IV evaluated the effect of intermittent hemodiafiltration (HDF) on myoglobin elimination from plasma in severe rhabdomyolysis. Study V assessed long-term survival and health-related quality of life (HRQoL) in ARF patients. Neither of the ARF-specific scoring methods showed good discriminative power regarding hospital mortality. The maximum RIFLE score for the first three days in the ICU was an independent predictor of hospital mortality. As a marker of renal dysfunction, serum cystatin C failed to show benefit compared with plasma creatinine in detecting ARF or predicting patient survival. Neither cystatin C, nor plasma concentrations of IL-6, IL-8, and IL-10, nor monocyte HLA-DR expression were clinically useful in predicting mortality in ARF patients. HDF may be used to clear myoglobin from plasma in rhabdomyolysis, especially if forced alkaline diuresis does not succeed. The long-term survival of patients with ARF was found to be poor. The HRQoL of those who survive is lower than that of the age- and gender-matched general population.
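As a rough illustration of how the creatinine arm of the RIFLE classification stratifies ARF severity, consider the simplified sketch below; it ignores the GFR and urine-output arms as well as the Loss and ESKD outcome classes, so it is not the full scoring method used in the thesis.

```python
def rifle_class(creat_ratio):
    """Classify acute kidney injury severity by the creatinine arm of the
    RIFLE criteria (Risk / Injury / Failure); a simplified sketch only.

    creat_ratio: serum creatinine divided by the patient's baseline value.
    """
    if creat_ratio >= 3.0:
        return "Failure"
    if creat_ratio >= 2.0:
        return "Injury"
    if creat_ratio >= 1.5:
        return "Risk"
    return "No RIFLE class"

for ratio in (1.2, 1.6, 2.4, 3.5):
    print(ratio, rifle_class(ratio))
```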
Abstract:
The purpose of this study was to evaluate the use of sentinel node biopsy (SNB) in the axillary nodal staging of breast cancer. Special interest was paid to sentinel node (SN) visualization, intraoperative detection of SN metastases, the feasibility of SNB in patients with pure tubular carcinoma (PTC) and in those with ductal carcinoma in situ (DCIS) in core needle biopsy (CNB), and to the detection of axillary recurrences after tumour-negative SNB. Patients and methods. 1580 clinically node-negative stage T1-T2 breast cancer patients underwent lymphoscintigraphy (LS), SNB and breast surgery between June 2000 and 2004 at the Breast Surgery Unit. The CNB samples were obtained from women who participated in the biennial, population-based mammography screening at the Mammography Screening Centre of Helsinki in 2001 - 2004. In the follow-up, a cohort of 205 patients who avoided axillary clearance (AC) due to negative SNB findings was evaluated using ultrasonography one and three years after breast surgery. Results. The visualization rate of axillary SNs was not enhanced by adjusting radioisotope doses according to BMI. The sensitivity of the intraoperative diagnosis of SN metastases of invasive lobular carcinoma (ILC) was higher, 87%, with rapid intraoperative immunohistochemistry (IHC) than without it (66%). The prevalence of tumour-positive SN findings was 27% in the 33 patients with breast tumours diagnosed as PTC. The median histological tumour size was similar in patients with or without axillary metastases. After the histopathological review, six out of 27 patients with true PTC had axillary metastases, with no significant change in the risk factors for axillary metastases. Of the 67 patients with DCIS in the preoperative percutaneous biopsy specimen, 30% had invasion in the surgical specimen. The strongest predictive factor for invasion was the visibility of the lesion in ultrasound. In the three-year follow-up, axillary recurrence was found in only two (0.5%) of the total of 383 ultrasound examinations performed during the study, and only one of the 369 examinations revealed cancer. None of the ultrasound examinations were false positive, and no study participant was subjected to unnecessary surgery due to ultrasound monitoring. Conclusions. Adjusting the dose of the radioactive tracer according to patient BMI does not increase the visualization rate of SNs. The intraoperative diagnosis of SN metastases is enhanced by rapid IHC, particularly in patients with ILC. SNB seems to be a feasible method for axillary staging of pure tubular carcinoma in patients with a low prevalence of axillary metastases. SNB also appears to be a sensible method in patients undergoing mastectomy due to DCIS in CNB. It also seems useful in patients with lesions visible on breast ultrasound (US). During follow-up, routine monitoring of the ipsilateral axilla using US is not worthwhile among breast cancer patients who avoided AC due to negative SN findings.
Abstract:
Background: Irritable bowel syndrome (IBS) is a common functional gastrointestinal (GI) disorder characterised by abdominal pain and abnormal bowel function. It is associated with a high rate of healthcare consumption and significant health care costs. The prevalence and economic burden of IBS in Finland have not been studied before. The aims of this study were to assess the prevalence of IBS according to various diagnostic criteria and to study the rates of psychiatric and somatic comorbidity in IBS. In addition, health care consumption and the societal costs of IBS were evaluated. Methods: The study was a two-phase postal survey. Questionnaire I, identifying IBS by the Manning 2 (at least two of the six Manning symptoms), Manning 3 (at least three Manning symptoms), Rome I, and Rome II criteria, was mailed to a random sample of 5 000 working-age subjects. It also covered extra-GI symptoms such as headache, back pain, and depression. Questionnaire II, covering rates of physician visits and use of GI medication, was sent to subjects fulfilling the Manning 2 or Rome II IBS criteria in Questionnaire I. Results: The response rates were 73% and 86% for Questionnaires I and II, respectively. The prevalence of IBS was 15.9%, 9.6%, 5.6%, and 5.1% according to the Manning 2, Manning 3, Rome I, and Rome II criteria, respectively. Of those meeting the Rome II criteria, 97% also met the Manning 2 criteria. Severe abdominal pain was more often reported by subjects meeting either of the Rome criteria than by those meeting either of the Manning criteria. Depression, anxiety, and several somatic symptoms were more common among subjects meeting any IBS criterion than among controls. Of subjects with depressive symptoms, 11.6% met the Rome II IBS criteria compared with 3.7% of those without depressive symptoms. Subjects meeting any IBS criteria made more physician visits than controls. Intensity of GI symptoms and presence of dyspeptic symptoms were the strongest predictors of GI consultations. Presence of dyspeptic symptoms and a history of abdominal pain in childhood also predicted non-GI visits. Annual GI-related individual costs were higher in the Rome II group (497 €) than in the Manning 2 group (295 €). Direct expenses of GI symptoms and non-GI physician visits ranged from 98 M€ for the Rome II criteria to 230 M€ for the Manning 2 criteria. Conclusions: The prevalence of IBS varies substantially depending on the criteria applied. The Rome II criteria are more restrictive than Manning 2, and they identify an IBS population with more severe GI symptoms, more frequent health care use, and higher individual health care costs. Subjects with IBS demonstrate high rates of psychiatric and somatic comorbidity regardless of health care seeking status. Perceived symptom severity rather than psychiatric comorbidity predicts health care seeking for GI symptoms. IBS incurs considerable medical costs. The direct GI and non-GI costs are equivalent to up to 5% of outpatient health care and medicine costs in Finland. A more integral approach to IBS by physicians, accounting also for comorbid conditions, may produce a more favourable course in IBS patients and reduce health care expenditures.
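The criterion-dependent prevalences above follow directly from applying different symptom-count cut-offs to the same questionnaire responses. A minimal sketch of that idea, using simulated answers rather than the survey's data (the symptom distribution below is an assumption):

```python
import random

random.seed(0)

MANNING_SYMPTOMS = 6  # the Manning criteria comprise six symptoms

# Hypothetical questionnaire data: number of Manning symptoms reported by
# each respondent (the real survey mailed 5 000 questionnaires).
respondents = [random.choices(range(MANNING_SYMPTOMS + 1),
                              weights=[40, 25, 15, 9, 6, 3, 2])[0]
               for _ in range(3650)]

def prevalence(symptom_counts, cutoff):
    """Share of respondents reporting at least `cutoff` Manning symptoms."""
    return sum(c >= cutoff for c in symptom_counts) / len(symptom_counts)

print(f"Manning 2 prevalence: {prevalence(respondents, 2):.1%}")
print(f"Manning 3 prevalence: {prevalence(respondents, 3):.1%}")
```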
Abstract:
Most of the diseases affecting public health, like hypertension, are multifactorial in etiology. Hypertension is influenced by genetic, lifestyle and environmental factors. Estimates of the genetic contribution to the risk of essential hypertension vary from 30 to 50%. It is plausible that in most cases susceptibility to hypertension is determined by the action of more than one gene. Although the exact molecular mechanism underlying essential hypertension remains obscure, several monogenic forms of hypertension have been identified. Since common genetic variations may predict not only susceptibility to hypertension but also response to antihypertensive drug therapy, pharmacogenetic approaches may provide useful markers for finding relations between candidate genes and phenotypes of hypertension. The aim of this study was to identify genetic mutations and polymorphisms contributing to human hypertension, and to examine their relationships to intermediate phenotypes of hypertension, such as blood pressure (BP) responses to antihypertensive drugs or biochemical laboratory values. Two groups of patients were investigated in the present study. The first group was collected from the database of patients investigated in the Hypertension Outpatient Ward, Helsinki University Central Hospital, and consisted of 399 subjects considered to have essential hypertension. Frequencies of the mutant or variant alleles were compared with those in two reference groups, healthy blood donors (n = 301) and normotensive males (n = 175). The second group of subjects with hypertension was collected prospectively. The study subjects (n = 313) underwent a protocol lasting eight months, including four one-month drug treatment periods with antihypertensive medications (a thiazide diuretic, a β-blocker, a calcium channel antagonist, and an angiotensin II receptor antagonist). BP responses and laboratory values were related to polymorphisms of several candidate genes of the renin-angiotensin system (RAS). In addition, two patients with typical features of Liddle’s syndrome were screened for mutations in the kidney epithelial sodium channel (ENaC) subunits. Two novel mutations causing Liddle’s syndrome were identified. The first mutation was located in the beta-subunit of ENaC and the second in the gamma-subunit, constituting the first identified Liddle mutation located in the extracellular domain. This mutation showed a 2-fold increase in channel activity in vitro. Three gene variants, of which two are novel, were identified in ENaC subunits. The prevalence of the variants was three times higher in hypertensive patients (9%) than in the reference groups (3%). The variant carriers had an increased daily urinary potassium excretion rate in relation to their renin levels compared with controls, suggesting increased ENaC activity, although in vitro they did not show increased channel activity. Of the common polymorphisms of the RAS studied, the angiotensin II receptor type 1 (AGTR1) 1166 A/C polymorphism was associated with modest changes in RAS activity. Thus, patients homozygous for the C allele tended to have increased aldosterone and decreased renin levels. In vitro functional studies using transfected HEK293 cells provided additional evidence that the AGTR1 1166 C allele may be associated with increased expression of AGTR1.
Common polymorphisms of the alpha-adducin and RAS genes did not significantly predict BP responses to one-month monotherapies with hydrochlorothiazide, bisoprolol, amlodipine, or losartan. In conclusion, two novel mutations of ENaC subunits causing Liddle’s syndrome were identified. In addition, three common ENaC polymorphisms were shown to be associated with the occurrence of essential hypertension, but their exact functional and clinical consequences remain to be explored. The AGTR1 1166 C allele may modify the endocrine phenotype of hypertensive patients when present in homozygous form. Certain widely studied polymorphisms of the ACE, angiotensinogen, AGTR1 and alpha-adducin genes did not significantly affect responses to a thiazide, a β-blocker, a calcium channel antagonist, or an angiotensin II receptor antagonist.
Abstract:
Rest tremor, rigidity, and slowness of movement, considered to be mainly due to markedly reduced levels of dopamine (DA) in the basal ganglia, are characteristic motor symptoms of Parkinson's disease (PD). Although there is as yet no cure for this illness, several drugs can alleviate the motor symptoms. Among these symptomatic therapies, L-dopa is the most effective. As a precursor to DA, it is able to replace the loss of DA in the basal ganglia. In the long run, however, L-dopa has disadvantages. Motor response complications, such as shortening of the duration of drug effect ("wearing-off"), develop in many patients. In addition, extensive peripheral metabolism of L-dopa by aromatic amino acid decarboxylase and catechol-O-methyltransferase (COMT) results in its short half-life, low bioavailability, and reduced efficacy. Entacapone, a nitrocatechol-structured compound, is a highly selective, reversible, and orally active inhibitor of COMT. It increases the bioavailability of L-dopa by reducing its peripheral elimination rate. Entacapone extends the duration of clinical response to each L-dopa dose in PD patients with wearing-off fluctuations. COMT is important in the metabolism of catecholamines. Its inhibition could, therefore, theoretically lead to adverse cardiovascular reactions, especially in circumstances of enhanced sympathetic activity (physical exercise). PD patients may be particularly vulnerable to such effects due to the high prevalence of cardiovascular autonomic dysfunction and the common use of the monoamine oxidase B inhibitor selegiline, another drug with effects on catecholamine metabolism. Both entacapone and selegiline enhance L-dopa's clinical effect. Their co-administration may therefore lead to pharmacodynamic interactions, either beneficial (improved L-dopa efficacy) or harmful (increased dyskinesia). We investigated the effects of repeated dosing (3-5 daily doses for 1-2 weeks) of entacapone 200 mg, administered either with or without selegiline (10 mg once daily), on several safety and efficacy parameters in 39 L-dopa-treated patients with mild to moderate PD in three double-blind, placebo-controlled, crossover studies. In the first two, the cardiovascular, clinical, and biochemical responses were assessed repeatedly for 6 hours after drug intake, first with L-dopa only (control), and then after 2 weeks on study drugs (entacapone vs. entacapone plus selegiline in one; entacapone vs. selegiline vs. entacapone plus selegiline in the other). The third study included cardiovascular reflex and spiroergometric exercise testing, first after overnight L-dopa withdrawal (control), and then after 1 week on entacapone plus selegiline as adjuncts to L-dopa. Ambulatory ECG was recorded in two of the studies. Blood pressure, heart rate, ECG, cardiovascular autonomic function, cardiorespiratory exercise responses, and the resting/exercise levels of circulating catecholamines remained unaffected by entacapone, irrespective of selegiline. Entacapone significantly enhanced both L-dopa bioavailability and its clinical response, the latter being more pronounced with the co-administration of selegiline. Dyskinesias were also increased during simultaneous use of both entacapone and selegiline as L-dopa adjuncts. Entacapone had no effect on either work capacity or work efficiency. The drug was well tolerated, both with and without selegiline.
Conclusions: The use of entacapone, either alone or combined with selegiline, seems to be hemodynamically safe in L-dopa-treated PD patients, also during maximal physical effort. This is in line with the safety experience from larger phase III studies. Entacapone had no effect on cardiovascular autonomic function. Concomitant administration of entacapone and selegiline may enhance L-dopa's clinical efficacy but may also lead to increased dyskinesia.
Abstract:
Objective: We aimed to assess the impact of task demands and individual characteristics on threat detection in baggage screeners. Background: Airport security staff work under time constraints to ensure optimal threat detection. Understanding the impact of individual characteristics and task demands on performance is vital to ensure accurate threat detection. Method: We examined threat detection in baggage screeners as a function of event rate (i.e., number of bags per minute) and time on task across 4 months. We measured performance in terms of the accuracy of detection of Fictitious Threat Items (FTIs) randomly superimposed on X-ray images of real passenger bags. Results: Analyses of the percentage of correct FTI identifications (hits) show that longer shifts with high baggage throughput result in worse threat detection. Importantly, these significant performance decrements emerge within the first 10 min of these busy screening shifts only. Conclusion: Longer shift lengths, especially when combined with high baggage throughput, increase the likelihood that threats go undetected. Application: Shorter shift rotations, although perhaps difficult to implement during busy screening periods, would ensure more consistently high vigilance in baggage screeners and, therefore, optimal threat detection and passenger safety.
Abstract:
Rheumatoid arthritis (RA) patients have premature mortality. In contrast to the general population, mortality in RA has not declined over time. This study aimed to evaluate determinants of mortality in RA by examining causes of death (CoDs) over time, the accuracy of CoD diagnoses, and the contribution of RA medication to CoDs. The study further evaluated the detection rate of reactive systemic amyloid A amyloidosis, which is an important contributor to RA mortality. CoDs were examined in 960 RA patients between 1971 and 1991 (Study population A) and in 369 RA patients autopsied from 1952 to 1991, with non-RA patients serving as the reference cases (Study population B). In Study population B, CoDs assigned by the clinician before autopsy were compared with those assigned by the pathologist at autopsy to study the accuracy of CoD diagnoses. In Study population B, autopsy tissue samples were re-examined systematically for amyloidosis (90% of patients) and clinical data for RA patients were studied from 1973 onwards. RA patients died most frequently of cardiovascular diseases (CVDs), infections, and RA. RA deaths declined over time. Coronary deaths showed no major change in Study population A, but in Study population B coronary deaths in RA patients increased from 1952 to 1991, while non-RA cases showed a decrease in coronary deaths starting in the 1970s. Agreement between the clinician's and the pathologist's CoD diagnoses was lower for RA patients than for non-RA cases regarding cardiovascular deaths (kappa reliability measure: 0.31 vs. 0.51) and coronary deaths (0.33 vs. 0.46). Use of disease-modifying anti-rheumatic drugs was not associated with any CoD. In RA patients, re-examination of autopsy tissue samples increased the prevalence of amyloid from 18% to 30% compared with the original autopsy reports. In the amyloid-positive RA patients, amyloidosis had been diagnosed before autopsy in only 37%, and they had higher inflammatory levels and longer duration of RA than amyloid-negative RA patients. Of the RA patients with amyloid, only half had renal failure or proteinuria during their lifetime. In RA, the most important determinants of mortality were CVDs, RA itself, and infections. In RA patients, RA deaths decreased over time, but this was not true for coronary deaths. The lower diagnostic accuracy for coronary deaths in RA may indicate that coronary heart disease (CHD) often goes unrecognized during the patients' lifetime. Thus, active search for CHD and its effective treatment is important to reduce cardiovascular mortality. Reactive amyloidosis may often go undetected. In RA patients with proteinuria or renal failure, as well as in those with active and long-lasting RA, a systematic search for amyloid is important to enable early diagnosis and early enhancement of therapy. This is essential to prevent clinical manifestations of amyloidosis such as renal failure, which has a poor prognosis.
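The clinician-pathologist agreement figures above are kappa statistics. A minimal sketch of how Cohen's kappa is computed from an agreement table, using made-up counts rather than the study's data:

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table (rows: rater 1, cols: rater 2)."""
    n = sum(sum(row) for row in table)
    p_obs = sum(table[i][i] for i in range(len(table))) / n
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    p_exp = sum(r * c for r, c in zip(row_tot, col_tot)) / (n * n)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical clinician-vs-pathologist table for "cardiovascular death" yes/no.
table = [[118, 62],   # clinician yes: pathologist yes / no
         [45, 140]]   # clinician no:  pathologist yes / no
print(f"kappa = {cohens_kappa(table):.2f}")
```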
Abstract:
Background. Kidney transplantation (KTX) is considered the best treatment of terminal uremia. Despite improvements in short-term graft survival, a considerable number of kidney allografts are lost due to the premature death of patients with a functioning kidney and to chronic allograft nephropathy (CAN). Aim. To investigate the risk factors involved in the progression of CAN and to analyze diagnostic methods for this entity. Materials and methods. Altogether, 153 implant and 364 protocol biopsies obtained between June 1996 and April 2008 were analyzed. The biopsies were classified according to Banff ’97 and the chronic allograft damage index (CADI). Immunohistochemistry for TGF-β1 was performed in 49 biopsies. Kidney function was evaluated by creatinine and/or cystatin C measurement and by various estimates of glomerular filtration rate (GFR). Demographic data of the donors and recipients were recorded after 2 years’ follow-up. Results. Most of the 3-month biopsies (73%) were nearly normal. The mean CADI score in the 6-month biopsies decreased significantly after 2001. Diastolic hypertension correlated with ΔCADI. Serum creatinine concentration at hospital discharge and glomerulosclerosis were risk factors for ΔCADI. High total and LDL cholesterol, low HDL and hypertension correlated with chronic histological changes. The mean age of the donors increased from 41 to 52 years. Older donors were more often women who had died from an underlying disease. The prevalence of delayed graft function (DGF) increased over the years, while acute rejections (AR) decreased significantly. Subclinical AR was observed in 4% and did not affect long-term allograft function or CADI. The recipients' drug treatment was modified over the course of the studies, with mycophenolate mofetil, tacrolimus, statins and blockers of the renin-angiotensin system more frequently prescribed after 2001. Patients with a higher ΔCADI had lower GFR during follow-up. CADI over 2 was best predicted by creatinine, although with modest sensitivity and specificity. Neither cystatin C nor other estimates of GFR were superior to creatinine for CADI prediction. Cyclosporine A toxicity was seldom seen. A low cyclosporine A concentration after 2 h correlated with TGF-β1 expression in interstitial inflammatory cells, and this predicted worse graft function. Conclusions. The progression of CAN has been affected by two major factors: the donors’ characteristics and the recipients’ hypertension. The increased prevalence of DGF might be a consequence of the acceptance of older donors who had died from an underlying disease. Implant biopsies proved to be of prognostic value, and they are essential for comparison with subsequent biopsies. The progression of histological damage was associated with hypertension and dyslipidemia. The significance of the augmented expression of TGF-β1 in inflammatory cells is unclear, but it may be related to low immunosuppression. Serum creatinine is the most suitable tool for monitoring kidney allograft function on an everyday basis. However, protocol biopsies at 6 and 12 months predicted late kidney allograft dysfunction and affected the clinical management of the patients. Protocol biopsies are thus a suitable surrogate to be used in clinical trials and for monitoring kidney allografts.
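Kidney function in such follow-up is often summarized with creatinine-based GFR estimates. As one widely used example, and not necessarily one of the estimates evaluated in this thesis, here is a hedged sketch of the 4-variable MDRD equation:

```python
def egfr_mdrd(creatinine_umol_l, age_years, female):
    """4-variable MDRD estimate of GFR [mL/min/1.73 m^2].

    A common creatinine-based estimate, shown for illustration only.
    Creatinine is converted from umol/L to mg/dL (1 mg/dL = 88.4 umol/L);
    the race coefficient of the original equation is omitted here.
    """
    scr_mg_dl = creatinine_umol_l / 88.4
    gfr = 175.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    return gfr * 0.742 if female else gfr

print(f"{egfr_mdrd(130.0, 55, female=False):.0f} mL/min/1.73 m^2")
```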
Abstract:
Children with intellectual disability are at increased risk for emotional and behavioural problems, but many of these disturbances fail to be diagnosed. Structured checklists have been used to supplement the psychiatric assessment of children without intellectual disability, but for children with intellectual disability, only a few checklists are available. The aim of the study was to investigate psychiatric disturbances among children with intellectual disability: the prevalence, types and risk factors of psychiatric disturbances as well as the applicability of the Finnish translations of the Developmental Behaviour Checklist (DBC-P) and the Child Behavior Checklist (CBCL) in the assessment of psychopathology. The subjects comprised 155 children with intellectual disability, and data were obtained from case records and five questionnaires completed by the parents or other carers of the child. According to case records, a psychiatric disorder had previously been diagnosed in 11% of the children. Upon careful re-examination of the case records, the total proportion of children with a psychiatric disorder increased to 33%. According to the checklists, the frequency of probable psychiatric disorder was 34% by the DBC-P and 43% by the CBCL. The most common diagnoses were pervasive developmental disorders and hyperkinetic disorders. The results support previous findings that, compared with children without intellectual disability, the risk of psychiatric disturbances is 2- to 3-fold in children with intellectual disability. The risk of psychopathology was most significantly increased by moderate intellectual disability and low socio-economic status, and decreased by better adaptive behaviour, language development, and socialisation, as well as by living with both biological parents. The results of the study suggest that both the DBC-P and the CBCL can be used to discriminate between children with intellectual disability with and without emotional or psychiatric disturbance. The DBC-P is suitable for children with any degree of intellectual disability, and the CBCL is suitable at least for children with mild intellectual disability. Because the problems of children with intellectual disability differ somewhat from those of children without intellectual disability, checklists designed specifically for children with intellectual disability are needed.
Abstract:
Vertigo in children is more common than previously thought. However, only a small fraction of affected children see a physician. The reason for this may be the benign course of vertigo in children. Most childhood vertigo is self-limiting, and the provoking factor can often be identified. The differential diagnostic process in children with vertigo is extensive and quite challenging even for otologists and child neurologists, who are the key persons involved in treating vertiginous children. The cause of vertigo can vary from orthostatic hypotension to a brain tumor, and thus a structured approach is essential in avoiding unnecessary examinations and achieving a diagnosis. Common forms of vertigo in children are otitis media-related dizziness, benign paroxysmal vertigo of childhood, migraine-associated dizziness, and vestibular neuronitis. Orthostatic hypotension, which is not a true vertigo, is the predominant type of dizziness in children. Vertigo is often divided according to origin into peripheral and central types. An otologist is familiar with peripheral causes, while a neurologist treats central causes. Close cooperation between different specialists is essential. Sometimes consultation with a psychiatrist or an ophthalmologist can lead to the correct diagnosis. The purpose of this study was to evaluate the prevalence and clinical characteristics of vertigo in children. We prospectively collected general population-based data from three schools and one child welfare clinic located close to Helsinki University Central Hospital (HUCH). A simple questionnaire with mostly closed questions was given to 300 consecutive children visiting the welfare clinic. At the schools, entire classes that fit the desired age groups received the questionnaire. Of the 1050 children who received the questionnaire, 938 (473 girls, 465 boys) returned it, the response rate thus being 89% (I). In Study II, we evaluated 24 children (15 girls, 9 boys) with true vertigo and 12 healthy age- and gender-matched controls. A detailed medical history was obtained using a structured approach, and an otoneurologic examination, including audiogram, electronystagmography, and tympanometry, was performed at the HUCH ear, nose, and throat clinic for cooperative subjects. In Study III, we reviewed and evaluated the medical records of 119 children (63 girls, 56 boys) aged 0-17 years who had visited the ear, nose, and throat clinic with a primary complaint of vertigo in 2000-2004. We also wanted information about indications for imaging of the head in vertiginous children. To this end, we reviewed the medical records of 978 children who had undergone imaging of the head for various indications. Of these, 87 children aged 0-16 years were imaged because of vertigo. The subjects of interest were the 23 vertiginous children with an acute abnormal finding on magnetic resonance imaging or computerized tomography (IV). Our results indicate that vertigo and other balance problems in children are quite common. Of the HUCH area population, 8% of the children had at some time experienced vertigo, dizziness, or balance problems. Of these, 23% had vertigo sufficiently severe to stop their activity (I). The structured data collection approach eased the evaluation of vertiginous children. More headaches and head traumas were observed in vertiginous children than in healthy controls (II).
The most common diagnoses of ear, nose, and throat clinic patients within the five-year period were benign paroxysmal vertigo of childhood, migraine-associated dizziness, vestibular neuronitis, and otitis media-related vertigo. Valuable tools in the diagnostic process were patient history and otoneurologic examinations, including audiogram, electronystagmography, and tympanometry (III). If the vertiginous child had neurological deficits, persistent headache, or preceding head trauma, imaging of the head was indicated (IV).
Abstract:
The prevalence of obesity is increasing at an alarming rate in all age groups worldwide. Obesity is a serious health problem due to the increased risk of morbidity and mortality. Although environmental factors play a major role in the development of obesity, the identification of rare monogenic defects in human genes has confirmed that obesity has a strong genetic component. Mutations have been identified in genes encoding proteins of the leptin-melanocortin signaling system, which has an important role in the regulation of appetite and energy balance. The present study aimed at identifying mutations and genetic variations in the melanocortin receptors 2-5 and in other genes on the same signaling pathway that might account for severe early-onset obesity in children and morbid obesity in adults. The main achievement of this thesis was the identification of melanocortin-4 receptor (MC4R) mutations in Finnish patients. Six pathogenic MC4R mutations (308delT, P299H, two S127L and two -439delGC mutations) were identified, corresponding to a prevalence of 3% in severe early-onset obesity. No obesity-causing MC4R mutations were found among patients with adult-onset morbid obesity. The MC4R 308delT deletion is predicted to result in a grossly truncated, nonfunctional receptor of only 107 amino acids. The C-terminal residues, which are important in MC4R cell surface targeting, are totally absent from the mutant 308delT receptor. In vitro functional studies supported a pathogenic role for the S127L mutation, since agonist-induced signaling of the receptor was impaired. Cell membrane localization of the S127L receptor did not differ from that of the wild-type receptor, confirming that the impaired function of the S127L receptor was due to reduced signaling properties. The P299H mutation leads to intracellular retention of the receptor. The -439delGC deletion is situated at a potential nescient helix-loop-helix 2 (NHLH2) binding site in the MC4R promoter. It was demonstrated that the transcription factor NHLH2 binds to the consensus sequence at the -439delGC site in vitro, possibly resulting in altered promoter activity. Several genetic variants were identified in the melanocortin-3 receptor (MC3R) and pro-opiomelanocortin (POMC) genes. These polymorphisms do not explain morbid obesity, but the results indicate that some of these genetic variations may be modifying factors in obesity, resulting in subtle changes in obesity-related traits. A risk haplotype for obesity was identified in the ectonucleotide pyrophosphatase/phosphodiesterase 1 (ENPP1) gene through a candidate gene single nucleotide polymorphism (SNP) genotyping approach. An ENPP1 haplotype, composed of SNPs rs1800949 and rs943003, was shown to be significantly associated with morbid obesity in adults. Accordingly, the MC3R, POMC and ENPP1 genes represent examples of susceptibility genes in which genetic variants predispose to obesity. In conclusion, pathogenic mutations in the MC4R gene were shown to account for 3% of cases with severe early-onset obesity in Finland. This is in line with results from other populations demonstrating that mutations in the MC4R gene underlie 1-6% of morbid obesity worldwide. MC4R deficiency thus represents the most common monogenic defect causing human obesity reported so far. The severity of the MC4 receptor defect appears to be associated with the time of onset and the degree of obesity. Classification of MC4R mutations may provide a useful tool when predicting the outcome of the disease.
In addition, several other genetic variants conferring susceptibility to obesity were detected in the MC3R, MC4R, POMC and ENPP1 genes.