Abstract:
The prevalence and assessment of neuroleptic-induced movement disorders (NIMDs) in a naturalistic schizophrenia population using conventional neuroleptics were studied. We recruited 99 chronic, institutionalized adult schizophrenia patients from a state nursing home in central Estonia. The total prevalence of NIMDs according to the diagnostic criteria of the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV) was 61.6%, and 22.2% of patients had more than one NIMD. We explored the reliability and validity of different instruments for measuring these disorders. First, we compared DSM-IV with the established observer rating scales: the Barnes Akathisia Rating Scale (BARS), the Simpson-Angus Scale (SAS; for neuroleptic-induced parkinsonism, NIP), and the Abnormal Involuntary Movement Scale (AIMS; for tardive dyskinesia), all three of which have been used for diagnosing NIMD. We found good overlap of cases for neuroleptic-induced akathisia (NIA) and tardive dyskinesia (TD) but somewhat poorer overlap for NIP, for which we suggest raising the commonly used threshold value of 0.3 to 0.65. Second, we compared the established observer rating scales with an objective motor measurement, namely controlled rest lower limb activity measured by actometry. Actometry supported the validity of BARS and SAS, but it could not be used alone in this naturalistic population with several co-existing NIMDs, as it could not differentiate the disorders from each other. Quantitative actometry may be useful in measuring changes in NIA and NIP severity in situations where the diagnosis has been made using another method. Third, after the relative failure of quantitative actometry to show diagnostic power in a naturalistic population, we explored descriptive ways of analysing actometric data, and demonstrated diagnostic power for pooled NIA and pseudoakathisia (PsA) in our population.
A subjective question concerning movement problems was able to discriminate NIA patients from all other subjects; answers to this question were not selective for other NIMDs. Chronic schizophrenia populations are common worldwide, and NIMD affected two-thirds of our study population. Prevention, diagnosis and treatment of NIMDs warrant more attention, especially in countries where typical antipsychotics are frequently used. Our study supported the validity and reliability of the DSM-IV diagnostic criteria for NIMD in comparison with established rating scales and actometry. SAS can be used with minor modifications for screening purposes. Controlled rest lower limb actometry was not diagnostically specific in our naturalistic population with several co-morbid NIMDs, but it may be sensitive in measuring changes in NIMDs.
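The scale-versus-criteria agreement described above can be sketched numerically. Below is a minimal illustration, with all patient data invented and only the 0.65 SAS cut-off taken from the text; Cohen's kappa stands in for the study's (unspecified) agreement statistic.

```python
# Hypothetical sketch: agreement between DSM-IV-based NIMD diagnoses and
# rating-scale-based diagnoses, summarised with Cohen's kappa.
# All patient data below are invented for illustration.

def cohens_kappa(a, b):
    """Cohen's kappa for two equal-length lists of categorical labels."""
    assert len(a) == len(b) and a
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                      # observed agreement
    labels = set(a) | set(b)
    pe = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)   # chance agreement
    return (po - pe) / (1 - pe)

def sas_diagnosis(mean_item_score, threshold=0.65):
    """Flag NIP when the mean SAS item score exceeds the threshold
    (the study suggests raising the common 0.3 cut-off to 0.65)."""
    return int(mean_item_score > threshold)

# Invented example: DSM-IV NIP diagnoses vs. SAS-based flags for 8 patients.
dsm = [1, 1, 0, 0, 1, 0, 0, 1]
sas_scores = [0.9, 0.7, 0.2, 0.5, 0.8, 0.1, 0.3, 0.4]
sas = [sas_diagnosis(s) for s in sas_scores]
kappa = cohens_kappa(dsm, sas)  # 0.75 for these invented data
```

Lowering the `threshold` argument back to 0.3 would flag more borderline patients and typically reduce agreement with the stricter DSM-IV diagnoses, which is the direction of the adjustment the study argues for.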
Abstract:
Background Risk-stratification of diffuse large B-cell lymphoma (DLBCL) requires identification of patients with disease that is not cured despite initial R-CHOP. Although the prognostic importance of the tumour microenvironment (TME) is established, the optimal strategy to quantify it is unknown. Methods The relationship between immune-effector and inhibitory (checkpoint) genes was assessed by NanoString™ in 252 paraffin-embedded DLBCL tissues. A model to quantify net anti-tumoural immunity as an outcome predictor was tested in 158 R-CHOP treated patients, and validated in tissue/blood from two independent R-CHOP treated cohorts of 233 and 140 patients, respectively. Findings T and NK-cell immune-effector molecule expression correlated with tumour-associated macrophage and PD-1/PD-L1 axis markers, consistent with malignant B-cells triggering a dynamic checkpoint response to adapt to and evade immune surveillance. A tree-based survival model was fitted to test whether immune-effector to checkpoint ratios were prognostic. The CD4*CD8:(CD163/CD68)*PD-L1 ratio stratified overall survival better than any single immune marker or combination of markers, distinguishing groups with disparate 4-year survivals (92% versus 47%). The immune ratio was independent of, and added to, the revised international prognostic index (R-IPI) and cell-of-origin (COO). The tissue findings were validated in 233 R-CHOP treated DLBCL patients. Furthermore, within the blood of 140 R-CHOP treated patients, immune-effector:checkpoint ratios differed between interim-PET/CT-positive and -negative patients.
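The ratio at the heart of the model can be written as a small function. This is an illustrative sketch on invented, normalised expression values with an assumed cut-point; it is not the study's fitted tree model.

```python
# Hypothetical sketch of the immune-effector:checkpoint ratio described above,
# CD4*CD8 : (CD163/CD68)*PD-L1. Expression values and cut-point are invented.

def immune_ratio(cd4, cd8, cd163, cd68, pdl1):
    """Net anti-tumoural immunity score: T-cell effector product divided by
    macrophage skew (CD163/CD68) times checkpoint expression (PD-L1)."""
    return (cd4 * cd8) / ((cd163 / cd68) * pdl1)

def risk_group(ratio, cutpoint=1.0):
    """Dichotomise: 'favourable' when effector signal outweighs checkpoint.
    The cut-point is an assumption for illustration, not the study's value."""
    return "favourable" if ratio >= cutpoint else "unfavourable"

r = immune_ratio(cd4=2.0, cd8=3.0, cd163=4.0, cd68=2.0, pdl1=1.5)  # 6 / 3 = 2.0
group = risk_group(r)
```

The design intuition matches the abstract: effector molecules (numerator) and checkpoint/macrophage markers (denominator) rise together, so the ratio captures the *net* balance rather than any single marker's level.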
Abstract:
Breast cancer is the most commonly occurring cancer among women, and its incidence is increasing worldwide. A positive family history is a well-established risk factor for breast cancer, and it is suggested that the proportion of breast cancer attributable to genetic factors may be as high as 30%. However, all the currently known breast cancer susceptibility genes are estimated to account for 20-30% of familial breast cancer, and only 5% of total breast cancer incidence. It is thus likely that there are still other breast cancer susceptibility genes to be found. Cellular responses to DNA damage are crucial for maintaining genomic integrity and preventing the development of cancer. The genes operating in the DNA damage response signaling network are thus good candidates for breast cancer susceptibility genes. The aim of this study was to evaluate the role of three DNA damage response associated genes, ATM, RAD50, and p53, in breast cancer. ATM, the gene causative for ataxia telangiectasia (A-T), has long been a strong candidate breast cancer susceptibility gene because of its function as a key DNA damage signal transducer. We analyzed the prevalence of known Finnish A-T related ATM mutations in a large series of familial and unselected breast cancer cases from different geographical regions in Finland. Of the seven A-T related mutations, two were observed in the studied familial breast cancer patients. Additionally, a third mutation previously associated with breast cancer susceptibility was also detected. These founder mutations may be responsible for excess familial breast cancer regionally in Northern and Central Finland, but in Southern Finland our results suggest only a minor effect, if any, of ATM variants on familial breast cancer. We also screened the entire coding region of the ATM gene in 47 familial breast cancer patients from Southern Finland, and evaluated the identified variants in additional cases and controls.
All the identified variants were too rare to contribute significantly to breast cancer susceptibility. However, the role of ATM in cancer development and progression was supported by the results of immunohistochemical studies of ATM expression, as reduced ATM expression in breast carcinomas was found to correlate with tumor differentiation and hormone receptor status. Aberrant ATM expression was also a feature shared by the BRCA1/2 and the difficult-to-treat ER/PR/ERBB2-triple-negative breast carcinomas. From the clinical point of view, identification of phenotypic and genetic similarities between the BRCA1/2 and the triple-negative breast tumors could have implications for designing novel targeted therapies to which both of these classes of breast cancer might be exceptionally sensitive. Mutations of another plausible breast cancer susceptibility gene, RAD50, were found to be very rare, and RAD50 can make only a minor contribution to familial breast cancer predisposition in the UK and Southern Finland. The Finnish founder mutation RAD50 687delT seems to be a null allele and may confer a small increase in breast cancer risk. RAD50 does not act as a classical tumor suppressor gene, but RAD50 haploinsufficiency may contribute to cancer. In addition to relatively rare breast cancer susceptibility alleles, common polymorphisms may also be associated with increased breast cancer risk. Furthermore, these polymorphisms may have an impact on the progression and outcome of the disease. Our results suggest no effect of the common p53 R72P polymorphism on familial breast cancer risk or on breast cancer risk in the population, but R72P seems to be associated with histopathologic features of the tumors and with patient survival; the 72P homozygous genotype was an independent prognostic factor among the unselected breast cancer patients, with a two-fold increased risk of death.
These results present important novel findings of clinical significance as well, since the codon 72 genotype could be a useful additional prognostic marker in breast cancer, especially in the subgroup of patients with wild-type p53 in their tumors.
Abstract:
A randomised, population-based screening design incorporating new technologies has been applied to the organised cervical cancer screening programme in Finland. In this experiment, women invited to routine five-yearly screening are individually randomised to be screened with automation-assisted cytology, a human papillomavirus (HPV) test, or conventional cytology. By using the randomised design, the ultimate aim is to assess and compare the long-term outcomes of the different screening regimens. The primary aim of the current study was to evaluate, based on the material collected during the implementation phase of the Finnish randomised screening experiment, the cross-sectional performance and validity of automation-assisted cytology (Papnet system) and primary HPV DNA testing (Hybrid Capture II assay for 13 oncogenic HPV types) within service screening, in comparison with conventional cytology. The parameters of interest were test positivity rate, histological detection rate, relative sensitivity, relative specificity, and positive predictive value. The effect of variation in performance by screening laboratory on age-adjusted cervical cancer incidence was also assessed. Based on the cross-sectional results, almost no differences were observed in the performance of conventional and automation-assisted screening. In contrast, primary HPV screening found 58% (95% confidence interval 19-109%) more cervical lesions than conventional screening. However, this was mainly due to an overrepresentation of mild- and moderate-grade lesions and is thus likely to result in overtreatment, since many of these lesions would never progress to invasive cancer. Primary screening with an HPV DNA test alone caused a substantial loss in specificity in comparison with cytological screening. With the use of a cytology triage test, the specificity of HPV screening improved to close to the level of conventional cytology.
The specificity of primary HPV screening was also increased by raising the test positivity cutoff from the level recommended for clinical use, but the increase was more modest than that gained with the use of cytology triage. The performance of the cervical cancer screening programme varied widely between the screening laboratories, but the variation in overall programme effectiveness between the respective populations was more marginal from the very beginning of the organised screening activity. Thus, conclusive interpretations of the quality or success of screening should not be based on performance parameters alone. In the evaluation of cervical cancer screening, the outcome should be selected as closely as possible to the true measure of programme effectiveness, which is the number of invasive cervical cancers and subsequent deaths prevented in the target population. The benefits and adverse effects of each newly suggested screening technology should be evaluated before the technology becomes an accepted routine in the existing screening programme. At best, the evaluation is performed in a randomised design, within the population and screening programme in question, which makes the results directly applicable to routine use.
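The "58% more lesions (95% CI 19-109%)" comparison above is a detection-rate ratio with a log-scale confidence interval. A generic sketch of that calculation, using invented counts chosen only to reproduce a ratio near 1.58 (the Katz log method; not the study's exact estimator):

```python
import math

# Hypothetical sketch: detection-rate ratio between two screening arms with a
# 95% CI computed on the log scale (Katz method). Counts below are invented.

def rate_ratio_ci(cases_a, n_a, cases_b, n_b, z=1.96):
    """Ratio of detection rates (arm A / arm B) with a Wald CI on log(RR)."""
    rr = (cases_a / n_a) / (cases_b / n_b)
    se = math.sqrt(1 / cases_a - 1 / n_a + 1 / cases_b - 1 / n_b)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Invented: 79 lesions per 1000 HPV-screened vs. 50 per 1000 cytology-screened.
rr, lo, hi = rate_ratio_ci(79, 1000, 50, 1000)  # rr = 1.58
```

A confidence interval excluding 1.0, as in the study's 19-109% excess, indicates a detection difference unlikely to be due to chance alone; it says nothing about whether the extra lesions found are clinically meaningful, which is the overtreatment point made above.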
Abstract:
Dendritic cells (DC) efficiently phagocytose invading bacteria, but fail to kill intracellular pathogens such as Salmonella enterica serovar Typhimurium (S. Typhimurium). We analysed the intracellular fate of Salmonella in murine bone marrow-derived DC (BM-DC). The intracellular proliferation and subcellular localization were investigated for wild-type S. Typhimurium and mutants deficient in Salmonella pathogenicity island 2 (SPI2), a complex virulence factor that is essential for systemic infections in the murine model and intracellular survival and replication in macrophages. Using a segregative plasmid to monitor intracellular cell division, we observed that, in BM-DC, S. Typhimurium represents a static, non-dividing population. In BM-DC, S. Typhimurium resides in a membrane-bound compartment that has acquired late endosomal markers. However, these bacteria respond to intracellular stimuli, because induction of SPI2 genes was observed. S. Typhimurium within DC are also able to translocate a virulence protein into their host cells. SPI2 function was not required for intracellular survival in DC, but we observed that the maturation of the Salmonella-containing vesicle is different in DC infected with wild-type bacteria and a strain deficient in SPI2. Our observations indicate that S. Typhimurium in DC are able to modify normal processes of their host cells.
Abstract:
Spirometry is the most widely used lung function test in the world. It is fundamental in the diagnostic and functional evaluation of various pulmonary diseases. In the studies described in this thesis, the spirometric assessment of reversibility of bronchial obstruction, its determinants, and its variation are described in a general population sample from Helsinki, Finland. This study is part of the FinEsS study, a collaborative study of the clinical epidemiology of respiratory health between Finland (Fin), Estonia (Es), and Sweden (S). Asthma and chronic obstructive pulmonary disease (COPD) constitute the two major obstructive airways diseases. The prevalence of asthma has increased, with around 6% of the population in Helsinki reporting physician-diagnosed asthma. The main cause of COPD is smoking, with changes in the population's smoking habits affecting its prevalence with a delay. Whereas airway obstruction in asthma is by definition reversible, COPD is characterized by fixed obstruction. Cough and sputum production, the first symptoms of COPD, are often misinterpreted as smoker's cough and not recognized as the first signs of a chronic illness. COPD is therefore widely underdiagnosed. More extensive use of spirometry in primary care is advocated to focus smoking cessation interventions on populations at risk. The use of forced expiratory volume in six seconds (FEV6) instead of forced vital capacity (FVC) has been suggested to enable office spirometry to be used in the earlier detection of airflow limitation. Although spirometry is a widely accepted standard method of assessing lung function, its methodology and interpretation are constantly developing.
In 2005, the ATS/ERS Task Force issued a joint statement which endorsed the 12% and 200 ml thresholds for significant change in forced expiratory volume in one second (FEV1) or FVC during bronchodilation testing, but included the notion that in cases where only FVC improves, it should be verified that this is not caused by a longer exhalation time in post-bronchodilator spirometry. This elicited new interest in the assessment of forced expiratory time (FET), a spirometric variable not usually reported or used in assessment. In this population sample, we examined FET and found it to average 10.7 (SD 4.3) s and to increase with ageing and with airflow limitation in spirometry. The intrasession repeatability of FET was the poorest of the spirometric variables assessed. Based on the intrasession repeatability, a limit of 3 s was suggested for significant change in FET during bronchodilation testing. FEV6 was found to perform as well as FVC in the population and in a subgroup of subjects with airways obstruction. In the bronchodilation test, decreases were frequently observed in FEV1 and particularly in FVC. The limit of significant increase based on the 95th percentile of the population sample was 9% for FEV1 and 6% for FEV6 and FVC; these are slightly lower than the current limits for single bronchodilation tests (ATS/ERS guidelines). FEV6 proved to be a valid alternative to FVC also in the bronchodilation test and would remove the need to control the duration of exhalation during the spirometric bronchodilation test.
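The ATS/ERS 2005 rule discussed above requires both a relative (≥ 12%) and an absolute (≥ 200 ml) improvement. A minimal sketch of that decision, with the thresholds exposed as parameters so that the population-derived limits suggested in the study (9% for FEV1, 6% for FEV6/FVC) could be substituted:

```python
# Sketch of the ATS/ERS 2005 significant-bronchodilation rule: the change in
# FEV1 (or FVC) must be >= 12% of baseline AND >= 200 ml. Volumes in litres.

def significant_bronchodilation(pre_l, post_l, pct_limit=12.0, abs_limit_l=0.200):
    """True when the post-bronchodilator value rises by at least pct_limit
    percent of baseline and at least abs_limit_l litres."""
    change = post_l - pre_l
    return change >= abs_limit_l and (change / pre_l) * 100.0 >= pct_limit

# FEV1 2.00 L -> 2.30 L: +0.30 L and +15%  -> significant
# FEV1 3.00 L -> 3.25 L: +0.25 L but +8.3% -> not significant (fails 12%)
```

The dual threshold is the reason the statement's FVC caveat matters: a longer post-bronchodilator exhalation can inflate FVC past both limits without any true change in airway calibre, which is what the FET analysis above addresses.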
Abstract:
The purpose of this study was to estimate the prevalence and distribution of reduced visual acuity, major chronic eye diseases, and the subsequent need for eye care services in the Finnish adult population comprising persons aged 30 years and older. In addition, we analyzed the effect of decreased vision on functioning and need for assistance, using the World Health Organization's (WHO) International Classification of Functioning, Disability, and Health (ICF) as a framework. The study was based on the Health 2000 health examination survey, a nationally representative, population-based comprehensive survey of health and functional capacity carried out in 2000 to 2001 in Finland. The study sample, representing the Finnish population aged 30 years and older, was drawn by two-stage stratified cluster sampling. The Health 2000 survey included a home interview and a comprehensive health examination conducted at a nearby screening center. If the invited participants did not attend, an abridged examination was conducted at home or in an institution. Based on our findings in participants, the great majority (96%) of Finnish adults had at least moderate visual acuity (VA ≥ 0.5) with current refraction correction, if any. However, in the age group 75–84 years the prevalence decreased to 81%, and after 85 years to 46%. In the population aged 30 years and older, the prevalence of habitual visual impairment (VA ≤ 0.25) was 1.6%, and 0.5% were blind (VA < 0.1). The prevalence of visual impairment increased significantly with age (p < 0.001), and after the age of 65 years the increase was sharp. Visual impairment was equally common in both sexes (OR 1.20, 95% CI 0.82 – 1.74). Based on self-reported and/or register-based data, the estimated total prevalences of cataract, glaucoma, age-related maculopathy (ARM), and diabetic retinopathy (DR) in the study population were 10%, 5%, 4%, and 1%, respectively. The prevalence of all of these chronic eye diseases increased with age (p < 0.001).
Cataract and glaucoma were more common in women than in men (OR 1.55, 95% CI 1.26 – 1.91 and OR 1.57, 95% CI 1.24 – 1.98, respectively). The most prevalent eye diseases in people with visual impairment (VA ≤ 0.25) were ARM (37%), unoperated cataract (27%), glaucoma (22%), and DR (7%). More than half (58%) of visually impaired people had had a vision examination during the past five years, and 79% had received some vision rehabilitation services, mainly in the form of spectacles (70%). Only one-third (31%) had received formal low vision rehabilitation (i.e., fitting of low vision aids, patient education, training for orientation and mobility, training for activities of daily living (ADL), or consultation with a social worker). People with low vision (VA 0.1 – 0.25) were less likely to have received formal low vision rehabilitation, magnifying glasses, or other low vision aids than blind people (VA < 0.1). Furthermore, low cognitive capacity and living in an institution were associated with limited use of vision rehabilitation services. Of the visually impaired living in the community, 71% reported a need for assistance and 24% had an unmet need for assistance in everyday activities. The prevalence of limitations in ADL, instrumental activities of daily living (IADL), and mobility increased with decreasing VA (p < 0.001). Visually impaired persons (VA ≤ 0.25) were four times more likely to have ADL disabilities than those with good VA (VA ≥ 0.8) after adjustment for sociodemographic and behavioral factors and chronic conditions (OR 4.36, 95% CI 2.44 – 7.78). Limitations in IADL and in measured mobility were five times as likely (OR 4.82, 95% CI 2.38 – 9.76 and OR 5.37, 95% CI 2.44 – 7.78, respectively) and self-reported mobility limitations three times as likely (OR 3.07, 95% CI 1.67 – 9.63) as in persons with good VA.
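The odds ratios with 95% confidence intervals quoted above follow the standard log (Woolf) method for a 2×2 table. A sketch with invented counts, not the study's data:

```python
import math

# Hypothetical sketch: odds ratio with a 95% CI from a 2x2 table using the
# Woolf log method, as behind figures like "OR 4.36, 95% CI 2.44 - 7.78".
# The table counts below are invented for illustration.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a, b = exposed with/without outcome; c, d = unexposed with/without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented: 20/80 impaired vs. 10/90 non-impaired with an ADL disability.
or_, lo, hi = odds_ratio_ci(20, 80, 10, 90)  # or_ = 2.25
```

Note that this unadjusted calculation differs from the study's adjusted ORs, which come from regression models controlling for sociodemographic and behavioral factors and chronic conditions.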
The high prevalence of age-related eye diseases and subsequent visual impairment in the fastest growing segment of the population will result in a substantial increase in the demand for eye care services in the future. Many of the visually impaired, especially older persons with decreased cognitive capacity or living in an institution, have not had a recent vision examination and lack adequate low vision rehabilitation. This highlights the need for regular evaluation of visual function in the elderly and an active dissemination of information about rehabilitation services. Decreased VA is strongly associated with functional limitations, and even a slight decrease in VA was found to be associated with limited functioning. Thus, continuous efforts are needed to identify and treat eye diseases to maintain patients’ quality of life and to alleviate the social and economic burden of serious eye diseases.
Abstract:
A new framework is proposed in this work to solve multidimensional population balance equations (PBEs) using the method of discretization. A continuous PBE is considered as a statement of evolution of one evolving property of particles and conservation of their n internal attributes. Discretization must therefore preserve n + 1 properties of particles. The continuously distributed population is represented on discrete fixed pivots as in the fixed pivot technique of Kumar and Ramkrishna [1996a. On the solution of population balance equation by discretization-I. A fixed pivot technique. Chemical Engineering Science 51(8), 1311-1332] for 1-d PBEs, but instead of the earlier extensions of this technique proposed in the literature, which preserve 2^n properties of non-pivot particles, the new framework requires only n + 1 properties to be preserved. This opens up the use of triangular and tetrahedral elements to solve 2-d and 3-d PBEs, instead of the rectangles and cuboids suggested in the literature. The capabilities of computational fluid dynamics and other packages available for generating complex meshes can also be harnessed. The numerical results obtained indeed show the effectiveness of the new framework. It also brings out the hitherto unknown role of the directionality of the grid in controlling the accuracy of the numerical solution of multidimensional PBEs. The numerical results show that the quality of the numerical solution can be improved significantly just by altering the directionality of the grid, which does not require any increase in the number of points, any refinement of the grid, or even redistribution of pivots in space. The directionality of a grid can be altered simply by regrouping pivots.
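The property-preservation idea is easiest to see in one dimension, where n = 1 internal attribute means n + 1 = 2 properties (number and mass) must be preserved. A minimal sketch of the 1-d fixed pivot split of Kumar and Ramkrishna that the framework generalises:

```python
# Sketch of the 1-d fixed pivot assignment (Kumar & Ramkrishna, 1996):
# a new particle of size x landing between pivots xi and xj is split between
# them so that both particle number and particle mass are preserved.

def fixed_pivot_fractions(x, xi, xj):
    """Fractions a, b assigned to pivots xi and xj (xi < x < xj), solving
    number conservation a + b = 1 and mass conservation a*xi + b*xj = x."""
    a = (xj - x) / (xj - xi)
    b = (x - xi) / (xj - xi)
    return a, b

a, b = fixed_pivot_fractions(x=1.25, xi=1.0, xj=2.0)  # a = 0.75, b = 0.25
# number preserved: a + b == 1;  mass preserved: a*1.0 + b*2.0 == 1.25
```

In n dimensions the same two-equation system becomes n + 1 linear conditions on the fractions, which is why a simplex (triangle in 2-d, tetrahedron in 3-d) with its n + 1 vertices suffices, whereas vertex-based splitting on a rectangle or cuboid imposes 2^n conditions.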
Abstract:
Eighty-five new cases of conjunctival melanoma (CM) were diagnosed in Finland between 1967 and 2000. The annual crude incidence of CM was 0.51 per million inhabitants. The average age-adjusted incidence of 0.54 doubled during the study period, analogous to the increase in the incidence of cutaneous malignant melanoma over the same period, suggesting a possible role for ultraviolet radiation in its pathogenesis. Nonlimbal tumors were more likely than limbal ones to recur, and they were associated with decreased survival. Increasing tumor thickness and recurrence of the primary tumor were other clinical factors related to death from CM. The histopathologic specimens of the 85 patients with CM were studied for cell type, mitotic count, tumor-infiltrating lymphocytes and macrophages, mean vascular density, extravascular matrix loops and networks, and mean diameter of the ten largest nucleoli (MLN). The absence of epithelioid cells, increasing mitotic count, and small MLN were associated with shorter time to recurrence according to Cox univariate regression. None of the histopathologic variables was associated with mortality from CM. Four (5%) patients had a CM limited to the cornea without evidence of a tumor other than primary acquired melanosis of the conjunctiva. Because there are no melanocytes in the cornea, the origin of these melanomas is most likely the limbal conjunctiva. All four corneally displaced CMs were limited to the epithelium, and none of the patients developed metastases. An anatomic sub-classification based on my patients and the world literature was developed for corneally displaced CM. In 20 patients the metastatic pattern could be determined. Ten patients had initial systemic metastases detected, nine had initial regional metastases, and in one case the two types were detected simultaneously.
The patients most likely to develop either type of initial metastasis were those with nonlimbal conjunctival melanoma, those with a primary tumor more than 2 mm thick, and those with a recurrent conjunctival melanoma. Approximately two-thirds of the patients had limbal CM, a location associated with good prognosis. One third, however, had a primary CM originating outside the limbus. In these patients the chance of developing local recurrences as well as systemic metastases was significantly higher than in patients with limbal CM. Each recurrence is accompanied by an increased risk of developing metastases, and recurrences contribute to death along with increasing tumor thickness and nonlimbal tumor location. In my data, equal numbers of patients had initial locoregional and systemic metastases. Patients with limbal primary tumors less than 2 mm in thickness rarely experienced metastases unless the tumor recurred. Consequently, the patients most likely to benefit from sentinel lymph node biopsy are those who have nonlimbal tumors, CM more than 2 mm thick, or recurrent CM. The histopathology of CM differs from that of uveal melanoma. Microvascular factors did not prove to be of prognostic importance, possibly because CM disseminates first to the regional lymph nodes at least as often, unlike uveal melanoma, which almost always disseminates hematogenously.
Abstract:
Background and aims. Type 1 diabetes (T1D), an autoimmune disease in which the insulin-producing beta cells are gradually destroyed, is preceded by a prodromal phase characterized by the appearance of diabetes-associated autoantibodies in the circulation. Both the timing of the appearance of autoantibodies and their quality have been used in the prediction of T1D among first-degree relatives of diabetic patients (FDRs). So far, no general strategies have been established for identifying individuals at increased disease risk in the general population, although the majority of new cases originate in this population. The current work aimed to assess the predictive role of diabetes-associated immunologic and metabolic risk factors in the general population, and to compare these factors with data obtained from studies on FDRs. Subjects and methods. Study subjects in the current work were subcohorts of participants of the Childhood Diabetes in Finland Study (DiMe; n=755), the Cardiovascular Risk in Young Finns Study (LASERI; n=3475), and the Finnish Type 1 Diabetes Prediction and Prevention Study (DIPP; n=7410). These children were observed for signs of beta-cell autoimmunity and progression to T1D, and the results were compared between the FDRs and the general population cohorts. Results and conclusions. By combining HLA and autoantibody screening, T1D risks similar to those reported for autoantibody-positive FDRs are observed in the pediatric general population. The progression rate to T1D is high in genetically susceptible children with persistent multipositivity. Measurement of IAA affinity failed to stratify risk in young IAA-positive children with HLA-conferred disease susceptibility, among whom the affinity of IAA did not increase during the prediabetic period.
Young age at seroconversion, increased weight-for-height, decreased early insulin response, and increased IAA and IA-2A levels predict T1D in young children with genetic disease susceptibility and signs of advanced beta-cell autoimmunity. Since the incidence of T1D continues to increase, efforts aimed at preventing T1D are important, and reliable disease prediction is needed both for intervention trials and for effective and safe preventive therapies in the future. Our observations confirmed that combined HLA-based screening and regular autoantibody measurements reveal disease risks in the pediatric general population similar to those seen in prediabetic FDRs, and that risk assessment can be stratified further by studying the glucose metabolism of prediabetic subjects. As these screening efforts are feasible in practice, the knowledge now obtained can be exploited in designing intervention trials aimed at secondary prevention of T1D.
Abstract:
Background: Irritable bowel syndrome (IBS) is a common functional gastrointestinal (GI) disorder characterised by abdominal pain and abnormal bowel function. It is associated with a high rate of healthcare consumption and significant health care costs. The prevalence and economic burden of IBS in Finland had not been studied before. The aims of this study were to assess the prevalence of IBS according to various diagnostic criteria and to study the rates of psychiatric and somatic comorbidity in IBS. In addition, health care consumption and the societal costs of IBS were evaluated. Methods: The study was a two-phase postal survey. Questionnaire I, identifying IBS by the Manning 2 (at least two of the six Manning symptoms), Manning 3 (at least three Manning symptoms), Rome I, and Rome II criteria, was mailed to a random sample of 5,000 working-age subjects. It also covered extra-GI symptoms such as headache, back pain, and depression. Questionnaire II, covering rates of physician visits and use of GI medication, was sent to subjects fulfilling the Manning 2 or Rome II IBS criteria in Questionnaire I. Results: The response rates were 73% and 86% for Questionnaires I and II, respectively. The prevalence of IBS was 15.9%, 9.6%, 5.6%, and 5.1% according to the Manning 2, Manning 3, Rome I, and Rome II criteria, respectively. Of those meeting Rome II criteria, 97% also met Manning 2 criteria. Severe abdominal pain was reported more often by subjects meeting either of the Rome criteria than by those meeting either of the Manning criteria. Depression, anxiety, and several somatic symptoms were more common among subjects meeting any IBS criterion than among controls. Of subjects with depressive symptoms, 11.6% met Rome II IBS criteria, compared with 3.7% of those without depressiveness. Subjects meeting any IBS criteria made more physician visits than controls. Intensity of GI symptoms and presence of dyspeptic symptoms were the strongest predictors of GI consultations.
Presence of dyspeptic symptoms and a history of abdominal pain in childhood also predicted non-GI visits. Annual GI-related individual costs were higher in the Rome II group (€497) than in the Manning 2 group (€295). Direct expenses of GI symptoms and non-GI physician visits ranged between €98 million for the Rome II and €230 million for the Manning 2 criteria. Conclusions: The prevalence of IBS varies substantially depending on the criteria applied. The Rome II criteria are more restrictive than Manning 2, and they identify an IBS population with more severe GI symptoms, more frequent health care use, and higher individual health care costs. Subjects with IBS demonstrate high rates of psychiatric and somatic comorbidity regardless of health care seeking status. Perceived symptom severity rather than psychiatric comorbidity predicts health care seeking for GI symptoms. IBS incurs considerable medical costs. The direct GI and non-GI costs are equivalent to up to 5% of outpatient health care and medicine costs in Finland. A more integral approach to IBS by physicians, accounting also for comorbid conditions, may produce a more favourable course in IBS patients and reduce health care expenditures.
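The criteria-dependent prevalence figures above can be illustrated with a toy classification: the same respondents yield different prevalences depending on the symptom-count threshold applied. The symptom data below are invented booleans, not study data.

```python
# Hypothetical sketch: classifying the same respondents under Manning 2
# (>= 2 of 6 Manning symptoms) and the stricter Manning 3 (>= 3 of 6).

def meets_manning(symptoms, k):
    """True when at least k of the six Manning symptoms are present."""
    assert len(symptoms) == 6
    return sum(bool(s) for s in symptoms) >= k

respondents = [
    [1, 1, 0, 0, 0, 0],  # 2 symptoms: Manning 2 only
    [1, 1, 1, 0, 0, 0],  # 3 symptoms: Manning 2 and Manning 3
    [1, 0, 0, 0, 0, 0],  # 1 symptom:  neither
]
prev2 = sum(meets_manning(r, 2) for r in respondents) / len(respondents)
prev3 = sum(meets_manning(r, 3) for r in respondents) / len(respondents)
# prev2 > prev3, mirroring the 15.9% (Manning 2) vs. 9.6% (Manning 3)
# gradient reported in the study.
```

The Rome criteria are not simple symptom counts (they also impose duration and pain-relief conditions), so they are not reproduced here; the point is only that a looser threshold necessarily yields a higher prevalence in the same sample.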
Resumo:
The purpose of this study was to compare the neuropsychological performance of two frontal dysexecutive phenotypes, the 'disinhibited' syndrome (DS) and the 'apathetic' syndrome (AS), following a traumatic brain injury in a non-Western population, Oman. Methods: The study compared the performance of DS and AS participants on neuropsychological measures, including those tapping verbal reasoning ability, working memory/attention, planning/goal-directed behaviour, and affective range. Results: The present analysis showed that DS and AS participants did not differ on indices measuring working memory/attention and affective range. However, the two cohorts differed significantly on measures of planning/goal-directed behaviour. Conclusion: This study lays the groundwork for further scrutiny in delineating the different characteristics of what has previously been labelled the frontal dysexecutive phenotype. It indicates that DS and AS are marked by distinct neuropsychological deficits.
Resumo:
The benefits of statins, drugs used to lower elevated cholesterol levels, in preventing cardiovascular disease are firmly established, and their use has grown strongly both in Finland and elsewhere in the world; in Finland there are about 600 000 statin users. Statin therapy is fairly well tolerated even in long-term use, but the most common adverse effects are muscle weakness, pain, and cramps, which can progress even to life-threatening muscle damage. The risk of muscle toxicity increases with the statin dose and with plasma statin concentrations. There are large inter-patient differences in statin plasma concentrations, efficacy, and the occurrence of adverse effects. The OATP1B1 transport protein, encoded by the SLCO1B1 gene, carries many endogenous substances and drugs, including statins, from the circulation across the cell membrane into hepatocytes; the cholesterol-lowering action and elimination of statins occur mainly in the liver. One nucleotide change in the SLCO1B1 gene (c.521T>C) is known to impair the transport activity of OATP1B1. This doctoral thesis investigated the hereditary variation of the SLCO1B1 gene in Finns and in different populations worldwide. In addition, the effects of SLCO1B1 variants on the concentrations (pharmacokinetics) and effects (pharmacodynamics) of different statins, and on cholesterol metabolism, were studied. For these studies, healthy volunteers were selected on the basis of SLCO1B1 genotype and given, on separate days, a single dose of each studied statin: fluvastatin, pravastatin, simvastatin, rosuvastatin, and atorvastatin. From blood samples, plasma concentrations of the statins and their metabolites were determined, as well as concentrations of cholesterol and of marker substances reflecting its synthesis and absorption. Large differences between populations were found in the frequency of functionally significant SLCO1B1 gene variants. 
In Finns, the frequency of the SLCO1B1 c.521TC genotype (the variant in one of the homologous chromosomes) was about 32%, and that of the SLCO1B1 c.521CC genotype (the variant in both homologous chromosomes) was about 4%. Globally, the frequency of the variants correlated with latitude, such that variants leading to low transporter activity were most common in the north and those leading to high activity in populations living near the equator. SLCO1B1 genotype had a significant effect on the plasma concentrations of the statins, with the exception of fluvastatin. The plasma concentrations of simvastatin acid were on average 220%, those of atorvastatin 140%, of pravastatin 90%, and of rosuvastatin 70% higher in subjects with the c.521CC genotype than in those with the normal c.521TT genotype. Genotype had no significant effect on the efficacy of any statin in this single-dose study, but variant carriers had a higher baseline rate of cholesterol synthesis. The results show that the SLCO1B1 c.521T>C variant is quite common in Finns and in other non-African populations. This variant may predispose to muscle toxicity caused especially by simvastatin, but also by atorvastatin, pravastatin, and rosuvastatin, by increasing their plasma concentrations. In the future, testing for the SLCO1B1 variant may help in choosing a suitable statin and dose for a patient, thereby improving both the safety and the efficacy of statin therapy.
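The reported percentage increases translate directly into fold changes in drug exposure: a 220% increase means the c.521CC genotype gives 3.2 times the exposure of the c.521TT genotype. A minimal sketch of that conversion, using only the figures stated above:

```python
# Relative single-dose plasma exposure in SLCO1B1 c.521CC vs. c.521TT
# carriers, computed from the mean percentage increases reported above.

increase_pct = {
    "simvastatin acid": 220,
    "atorvastatin": 140,
    "pravastatin": 90,
    "rosuvastatin": 70,
    "fluvastatin": 0,  # no significant genotype effect was observed
}

def fold_change(pct_increase):
    # A percentage increase over baseline corresponds to a fold change
    # of 1 + (increase / 100).
    return 1 + pct_increase / 100

for statin, pct in increase_pct.items():
    print(f"{statin}: {fold_change(pct):.1f}-fold exposure in c.521CC carriers")
```

The ordering makes the clinical point of the abstract concrete: the genotype effect, and hence the expected muscle-toxicity risk from elevated plasma levels, is largest for simvastatin and absent for fluvastatin.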
Resumo:
Cyclosporine is an immunosuppressant drug with a narrow therapeutic index and large variability in pharmacokinetics. To improve cyclosporine dose individualization in children, we used population pharmacokinetic modeling to study the effects of developmental, clinical, and genetic factors on cyclosporine pharmacokinetics in 176 subjects (age range: 0.36–20.2 years) before and up to 16 years after renal transplantation. Pre-transplantation test doses of cyclosporine were given intravenously (3 mg/kg) and orally (10 mg/kg), on separate occasions, followed by blood sampling for 24 hours (n=175). After transplantation, in a total of 137 patients, cyclosporine concentration was quantified at trough, two hours post-dose, or with dose-interval curves. One hundred and four of the studied patients were genotyped for 17 putatively functionally significant sequence variations in the ABCB1, SLCO1B1, ABCC2, CYP3A4, CYP3A5, and NR1I2 genes. Pharmacokinetic modeling was performed with the nonlinear mixed-effects modeling program NONMEM. A 3-compartment population pharmacokinetic model with first-order absorption without lag time was used to describe the data. The most important covariate affecting systemic clearance and distribution volume was allometrically scaled body weight, i.e., body weight to the power of 3/4 for clearance and absolute body weight for volume of distribution. The clearance adjusted by absolute body weight declined with age, and pre-pubertal children (< 8 years) had an approximately 25% higher clearance/body weight (L/h/kg) than did older children. Adjustment of clearance for allometric body weight removed its relationship to age after the first year of life. This finding is consistent with a gradual reduction in relative liver size towards adult values, and a relatively constant CYP3A content in the liver from about 6–12 months of age to adulthood. 
The other significant covariates affecting cyclosporine clearance and volume of distribution were hematocrit, plasma cholesterol, and serum creatinine, explaining up to 20%–30% of inter-individual differences before transplantation. After transplantation, their predictive role was smaller, as the variations in hematocrit, plasma cholesterol, and serum creatinine were also smaller. Before transplantation, no clinical or demographic covariates were found to affect oral bioavailability, and no systematic age-related changes in oral bioavailability were observed. After transplantation, older children receiving cyclosporine twice daily as the gelatine capsule microemulsion formulation had an approximately 1.25–1.3 times higher bioavailability than did the younger children receiving the liquid microemulsion formulation thrice daily. Moreover, cyclosporine oral bioavailability increased over 1.5-fold in the first month after transplantation, returning thereafter gradually to its initial value over 1–1.5 years. The largest cyclosporine doses were administered in the first 3–6 months after transplantation, and thereafter the single doses of cyclosporine were often smaller than 3 mg/kg. Thus, the results suggest that cyclosporine displays dose-dependent, saturable pre-systemic metabolism even at low single doses, whereas complete saturation of CYP3A4 and MDR1 (P-glycoprotein) renders cyclosporine pharmacokinetics dose-linear at higher doses. No significant associations were found between genetic polymorphisms and cyclosporine pharmacokinetics before transplantation in the whole population for which genetic data were available (n=104). However, in children older than eight years (n=22), heterozygous and homozygous carriers of the ABCB1 c.2677T or c.1236T alleles had an approximately 1.3- or 1.6-fold higher oral bioavailability, respectively, than did non-carriers. 
After transplantation, none of the ABCB1 SNPs or any other SNPs were found to be associated with cyclosporine clearance or oral bioavailability in the whole population, in the patients older than eight years, or in the patients younger than eight years. However, in the whole population, in those patients carrying the NR1I2 g.-25385C–g.-24381A–g.-205_-200GAGAAG–g.7635G–g.8055C haplotype, the bioavailability of cyclosporine was about one tenth lower, per allele, than in non-carriers. This effect was also significant in the subgroup of patients older than eight years. Furthermore, in patients carrying the NR1I2 g.-25385C–g.-24381A–g.-205_-200GAGAAG–g.7635G–g.8055T haplotype, the bioavailability was almost one fifth higher, per allele, than in non-carriers. It may be possible to improve individualization of cyclosporine dosing in children by accounting for the effects of developmental factors (body weight, liver size), time after transplantation, and cyclosporine dosing frequency/formulation. Further studies are required on the predictive value of genotyping for individualization of cyclosporine dosing in children.
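The allometric scaling described in this abstract can be sketched as follows. This is a minimal illustration of the general covariate model (clearance scaling with body weight to the 3/4 power, volume with body weight itself), not the study's fitted model; the reference weight and reference parameter values are conventional or hypothetical placeholders, not estimates from the study.

```python
# Minimal sketch of allometric covariate scaling in a population PK model.
# CL_i = CL_ref * (WT_i / WT_ref) ** 0.75 ;  V_i = V_ref * (WT_i / WT_ref)
# WT_ref and CL_ref below are illustrative, not study estimates.

REF_WEIGHT_KG = 70.0  # conventional adult reference weight

def scaled_clearance(cl_ref, weight_kg):
    # Clearance scales with body weight to the 3/4 power.
    return cl_ref * (weight_kg / REF_WEIGHT_KG) ** 0.75

def scaled_volume(v_ref, weight_kg):
    # Volume of distribution scales linearly with body weight.
    return v_ref * (weight_kg / REF_WEIGHT_KG)

# Per-kilogram clearance falls as weight rises, which is why small children
# show a higher clearance per kg (L/h/kg) than older children and adults:
for wt in (10, 20, 40, 70):
    cl = scaled_clearance(30.0, wt)  # hypothetical CL_ref = 30 L/h
    print(f"{wt} kg: CL = {cl:.1f} L/h, CL/kg = {cl / wt:.2f} L/h/kg")
```

Because the exponent is below one, clearance per kilogram decreases monotonically with weight; adjusting clearance by this allometric term rather than by absolute weight is what removed the apparent age dependence in the study.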