Abstract:
DNA and the genes it contains direct all cellular activity. Mutations nevertheless accumulate in DNA molecules, both through environmental influences and as a result of the cells' own activity. If these errors are not repaired, the cell may be transformed into a cancer cell. Cells therefore employ several DNA repair mechanisms, one of which is the so-called mismatch repair (MMR) system. MMR is responsible for correcting errors that arise during DNA replication. Inherited mutations in the genes encoding MMR proteins impair DNA repair and predispose their carriers to hereditary nonpolyposis colorectal cancer (HNPCC). The most commonly mutated MMR genes are MLH1 and MSH2. HNPCC is inherited in a dominant manner, meaning that a gene defect inherited from only one parent is sufficient to predispose to cancer. A carrier of an MMR gene defect has a high lifetime probability of developing cancer, and the age at onset is only about 40 years. Identifying the predisposing gene defect in mutation carriers is very important, because regular surveillance allows a developing tumour to be detected and removed at an early stage. This has been shown to reduce cancer mortality considerably. Reliable knowledge of the origin of the predisposition is also important for those members of a cancer family who do not carry the mutation. Alongside predisposing mutations, MMR genes regularly harbour changes that represent normal genetic variation between individuals and are not expected to increase cancer susceptibility. Distinguishing predisposing mutations from these neutral variants is difficult, but necessary to ensure effective surveillance of those at risk. In this thesis, 18 mutations of the MSH2 gene were studied. The mutations had been identified in families with a high incidence of cancer, but their effect on DNA repair efficiency and cancer predisposition was unclear. The effect of each mutation on the normal function of the MSH2 protein was examined, and the results were compared with the clinical data of the patients and their families. Of the mutations studied, 12 caused deficiencies in MMR and were interpreted as cancer-predisposing. The 4 mutations that functioned normally in the analyses are unlikely to be the cause of cancer in the families concerned. The interpretation was left open for 2 mutations. The study directly benefited the families carrying the described mutations, as information was obtained on the cancer risk conferred by their gene defect, enabling genetic counselling and surveillance to be targeted at those who need it. The work also clarified the mechanisms by which a mutated MSH2 protein can lose its function.
Abstract:
For optimal treatment planning, a thorough assessment of the metastatic status of mucosal squamous cell carcinoma of the head and neck (HNSCC) is required. Current imaging methods do not allow the recognition of all patients with metastatic disease. Therefore, elective treatment of the cervical lymph nodes is usually given to patients in whom the risk of subclinical metastasis is estimated to exceed 15-20%. The objective of this study was to improve the pre-treatment evaluation of patients diagnosed with HNSCC. In particular, we aimed to improve the identification of patients who will benefit from elective neck treatment. Computed tomography (CT) of the chest and abdomen was performed prospectively for 100 patients diagnosed with HNSCC. The findings were analysed to clarify the indications for this examination in this patient group. CT of the chest influenced the treatment approach in 3% of patients, while CT of the abdomen did not reveal any significant findings. Our results suggest that CT of the chest and abdomen is not indicated routinely for patients with newly diagnosed HNSCC but can be considered in selected cases. A retrospective analysis of 80 patients treated for early-stage squamous cell carcinoma of the oral tongue was performed to investigate the potential benefits of elective neck treatment and to examine whether histopathological features of the primary tumour could be used to predict occult metastases, local recurrence, and/or poor survival. Patients who had received elective neck treatment had significantly fewer cervical recurrences during follow-up than those who only had close observation of the cervical lymph nodes. Elective neck treatment did not, however, result in a survival benefit. Of the histopathological parameters examined, depth of infiltration and pT-category (representing tumour diameter) predicted occult cervical metastasis, but only the pT-category predicted local recurrence. Depth of infiltration can be used in the identification of at-risk patients, but no clear cut-off value separating high-risk and low-risk patients was found. None of the histopathological parameters examined predicted survival. Sentinel lymph node (SLN) biopsy was studied as a means of diagnosing patients with subclinical cervical metastases. SLN biopsy was applied to 46 patients who underwent elective neck dissection for oral squamous cell carcinoma. In addition, SLN biopsy was applied to 13 patients with small oral cavity tumours who were not intended to undergo elective neck dissection because of a low risk of occult metastasis. The sensitivity of SLN biopsy for finding subclinical cervical metastases was 67% when SLN status was compared with the metastatic status of the rest of the neck dissection specimen. Among the patients not planned to have elective neck dissection, SLN biopsy revealed cervical metastasis in 15%. Our results suggest that SLN biopsy cannot yet entirely replace elective neck dissection in the treatment of oral cancer, but it seems beneficial for patients with a low risk of metastasis who are not intended for elective neck treatment according to current treatment protocols.
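As a brief, hedged illustration of how a sensitivity figure such as the 67% reported above is obtained (sensitivity = true positives / (true positives + false negatives)), the sketch below uses purely hypothetical counts; the per-patient counts behind the study's figure are not given in the abstract.

```python
# Hedged sketch: how a sensitivity figure like the 67% reported for SLN biopsy
# is computed. The counts below are hypothetical, not taken from the study.

def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Sensitivity = TP / (TP + FN): the share of truly metastatic necks
    in which the sentinel lymph node biopsy was positive."""
    return true_positives / (true_positives + false_negatives)

# Illustrative counts only: if 12 neck-dissection specimens contained
# metastases and the sentinel node was positive in 8 of them,
# sensitivity would be 8 / (8 + 4) = 0.67.
print(f"Sensitivity: {sensitivity(8, 4):.0%}")
```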
Abstract:
This study is one part of a collaborative depression research project, the Vantaa Depression Study (VDS), involving the Department of Mental and Alcohol Research of the National Public Health Institute, Helsinki, and the Department of Psychiatry of the Peijas Medical Care District (PMCD), Vantaa, Finland. The VDS includes two parts: a record-based study of 803 patients and a prospective, naturalistic cohort study of 269 patients. Both studies include secondary-level care psychiatric out- and inpatients with a new episode of major depressive disorder (MDD). Data for the record-based part of the study came from a computerised patient database incorporating all outpatient visits as well as treatment periods at the inpatient unit. We included all patients aged 20 to 59 years who had been assigned a clinical diagnosis of depressive episode or recurrent depressive disorder according to the International Classification of Diseases, 10th edition (ICD-10) criteria and who had at least one outpatient visit or day as an inpatient in the PMCD during the study period January 1, 1996, to December 31, 1996. All those with an earlier diagnosis of schizophrenia, other non-affective psychosis, or bipolar disorder were excluded. Patients treated in the somatic departments of Peijas Hospital and those who had consulted but not received treatment from the psychiatric consultation services were also excluded. The study sample comprised 290 male and 513 female patients. All their psychiatric records were reviewed, and a structured 57-item form was completed for each patient. The treatment provided was reviewed up to the end of the depressive episode or to the end of 1997. Most (84%) of the patients received antidepressants, including a minority (11%) treated with clearly subtherapeutic low doses. During the treatment period the depressed patients investigated averaged only a few visits to psychiatrists (median two visits), but more to other health professionals (median seven). One-fifth of both genders were inpatients, with a mean of nearly two inpatient treatment periods during the overall treatment period investigated. The median length of a hospital stay was 2 weeks. Use of antidepressants was quite conservative: the first antidepressant had been switched to another compound in only about one-fifth (22%) of patients, and only two patients had received up to five antidepressant trials. Only 7% of those prescribed any antidepressant received two antidepressants simultaneously. None of the patients was prescribed any other augmentation medication. Refusal of antidepressant treatment was the most common explanation for receiving no antidepressants. During the treatment period, 19% of those not already receiving a disability pension were granted one due to psychiatric illness. These patients were nearly nine years older than those not pensioned. They were also more severely ill, made significantly more visits to professionals, and received significantly more concomitant medications (hypnotics, anxiolytics, and neuroleptics) than did those receiving no pension. In the prospective part of the VDS, 806 adult patients (aged 20-59 years) were screened in the PMCD for a possible new episode of DSM-IV MDD. Of these, 542 patients were interviewed face-to-face with the WHO Schedules for Clinical Assessment in Neuropsychiatry (SCAN), Version 2.0. Exclusion criteria were the same as in the record-based part of the VDS. Of these 542, 269 patients fulfilled the criteria for a DSM-IV major depressive episode (MDE).
This study investigated factors associated with patients' functional disability, social adjustment, and work disability (being on sick-leave or being granted a disability pension). At the beginning of treatment, the most important single factor associated with overall social and functional disability was found to be severity of depression, but older age and personality disorders also contributed significantly. Total duration and severity of depression, phobic disorders, alcoholism, and personality disorders all independently contributed to poor social adjustment. Of those who were employed, almost half (43%) were on sick-leave. Besides severity and number of episodes of depression, female gender and age over 50 years strongly and independently predicted being on sick-leave. Factors influencing social and occupational disability and social adjustment among patients with MDD were studied prospectively during an 18-month follow-up period. Patients' functional disability and impaired social adjustment were alleviated during the follow-up concurrently with recovery from depression. The current level of functioning and social adjustment of a patient with depression was predicted by severity of depression, recurrence before baseline and during follow-up, lack of full remission, and time spent depressed. Comorbid psychiatric disorders, personality traits (neuroticism), and perceived social support also had a significant influence. During the 18-month follow-up period, 13 (5%) of the 269 patients switched to bipolar disorder and 58 (20%) dropped out. Of the remaining 198 patients, 186 (94%) were not pensioned at baseline, and these patients were investigated further. Of them, 21 were granted a disability pension during the follow-up. Those who received a pension were significantly older, less often had vocational education, and were more often on sick-leave than those not pensioned, but did not differ with regard to any other sociodemographic or clinical factors. Patients with MDD mostly received adequate antidepressant treatment, but problems existed in treatment intensity and monitoring. It is challenging to find those at greatest risk for disability and to provide them with adequate and efficacious treatment. This also poses a considerable challenge to society as a whole to provide sufficient resources.
Abstract:
There is an ongoing controversy as to which methods in total hip arthroplasty (THA) could provide young patients with the best long-term results. THA is an especially demanding operation in patients with severely dysplastic hips. The optimal surgical treatment for these patients also remains controversial. The aim of this study was to evaluate the long-term survival of THA in young patients (<55 years at the time of the primary operation) on a nation-wide level, and to analyse the long-term clinical and radiographical outcome of uncemented THA in patients with severely dysplastic joints. Survival of 4661 primary THAs performed for primary osteoarthritis (OA), 2557 primary THAs performed for rheumatoid arthritis (RA), and modern uncemented THA designs performed for primary OA in young patients was analysed from the Finnish Arthroplasty Register. A total of 68 THAs were performed in 56 consecutive patients with high congenital hip dislocation between 1989 and 1994, and 68 THAs were performed in 59 consecutive patients with severely dysplastic hips and a previous Schanz osteotomy of the femur between 1988 and 1995 at the Orton Orthopaedic Hospital, Helsinki, Finland. These patients underwent a detailed physical and radiographical evaluation at a mean of 12.3 years and 13.0 years postoperatively, respectively. The risk of stem revision due to aseptic loosening in young patients with primary OA was higher for cemented stems than for proximally porous-coated or HA-coated uncemented stems implanted over the 1991-2001 period. There was no difference in the risk of revision between all-poly cemented cups and press-fit porous-coated uncemented cups implanted during the same period, when the end point was defined as any revision (including exchange of liner). All uncemented stem designs studied in young patients with primary OA had >90% survival rates at 10 years. The Biomet Bi-Metric stem had a 95% (95% CI 93-97) survival rate even at 15 years. When the end point was defined as any revision, 10-year survival rates of all uncemented cup designs except the Harris-Galante II decreased to <80%. In young patients with RA, the risk of stem revision due to aseptic loosening was higher with cemented stems than with proximally porous-coated uncemented stems. In contrast, the risk of cup revision was higher for all uncemented cup concepts than for all-poly cemented cups with any type of cup revision as the end point. The Harris hip score increased significantly (p<0.001) both in patients with high congenital hip dislocation and in patients with severely dysplastic hips and a previous Schanz osteotomy, treated with uncemented THA. The Trendelenburg sign was negative in 92% and 88% of hips, respectively. There were 12 (18%) and 15 (22%) perioperative complications, respectively. The rate of survival for the CDH femoral components, with revision due to aseptic loosening as the end point, was 98% (95% CI 97-100) at 10 years in patients with high hip dislocation and 92% (95% CI 86-99) at 14 years in patients with a previous Schanz osteotomy. The rate of survival for press-fit, porous-coated acetabular components, with revision due to aseptic loosening as the end point, was 95% (95% CI 89-100) at 10 years in patients with high hip dislocation, and 98% (95% CI 89-100) in patients with a previous Schanz osteotomy. When revision of the cup for any reason was defined as the end point, 10-year survival rates declined to 88% (95% CI 81-95) and to 69% (95% CI 56-82), respectively.
For young patients with primary OA, uncemented proximally circumferentially porous- and HA-coated stems are the implants of choice. However, survival rates of modern uncemented cups are no better than those of all-poly cemented cups. Uncemented proximally circumferentially porous-coated stems and cemented all-poly cups are currently the implants of choice for young patients with RA. Uncemented THA, with placement of the cup at the level of the true acetabulum, distal advancement of the greater trochanter, and femoral shortening osteotomy, provided patients with high congenital hip dislocation with good long-term outcomes. Most patients with severely dysplastic hips and a previous Schanz osteotomy can be successfully treated with the same method. However, subtrochanteric segmental shortening with angular correction gives better leg-length correction for patients with a previous low-seated unilateral Schanz osteotomy.
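Register-based implant survival at a fixed horizon, reported above with 95% confidence intervals, is conventionally estimated with the Kaplan-Meier method. The abstract does not name the estimator, so that choice, and all of the data in the sketch below, are assumptions made purely for illustration.

```python
# Hedged sketch: estimating implant survival at 10 years from register-style
# data with the Kaplan-Meier method (assumed, not stated in the abstract).
# The follow-up times and revision flags below are illustrative only.

def kaplan_meier(times, events):
    """Return (time, survival) pairs; events[i] is True if a revision occurred
    at times[i], False if the hip was censored (unrevised at last follow-up)."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    curve = []
    for i in order:
        if events[i]:                      # revision -> survival curve drops
            surv *= (at_risk - 1) / at_risk
            curve.append((times[i], surv))
        at_risk -= 1                       # revised or censored hips leave the risk set
    return curve

# Illustrative data only: years of follow-up and whether the stem was revised.
times  = [2.1, 4.5, 5.0, 7.3, 9.8, 10.0, 11.2, 12.5, 13.0, 15.0]
events = [False, True, False, False, True, False, False, True, False, False]

curve = kaplan_meier(times, events)
surv_10y = min((s for t, s in curve if t <= 10), default=1.0)
print(f"Estimated 10-year survival: {surv_10y:.0%}")
```

In practice the register analyses also attach Greenwood-type confidence intervals to these estimates, which is where figures such as "95% CI 93-97" come from.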
Abstract:
Assessment of the outcome of critical illness is complex. Severity scoring systems and organ dysfunction scores are traditional tools in mortality and morbidity prediction in intensive care. Their ability to explain risk of death is impressive for large cohorts of patients, but insufficient for an individual patient. Although events before intensive care unit (ICU) admission are prognostically important, the prediction models utilize data collected at and just after ICU admission. In addition, several biomarkers have been evaluated to predict mortality, but none has proven entirely useful in clinical practice. Therefore, new prognostic markers of critical illness are vital when evaluating the outcome of intensive care. The aim of this dissertation was to investigate new measures and biological markers of critical illness and to evaluate their predictive value and association with mortality and disease severity. The impact of delay in the emergency department (ED) on intensive care outcome, measured as hospital mortality and health-related quality of life (HRQoL) at 6 months, was assessed in 1537 consecutive patients admitted to the medical ICU. Two new biological markers were investigated in two separate patient populations: 231 ICU patients and 255 patients with severe sepsis or septic shock. Cell-free plasma DNA is a surrogate marker of apoptosis. Its association with disease severity and mortality rate was evaluated in ICU patients. Next, the predictive value of plasma DNA regarding mortality, and its association with the degree of organ dysfunction and disease severity, was evaluated in severe sepsis or septic shock. Heme oxygenase-1 (HO-1) is a potential regulator of apoptosis. Finally, HO-1 plasma concentrations and HO-1 gene polymorphisms and their association with outcome were evaluated in ICU patients. The length of ED stay was not associated with the outcome of intensive care. The hospital mortality rate was significantly lower in patients admitted to the medical ICU from the ED than in those admitted from other sources (non-ED), and the HRQoL of the critically ill at 6 months was significantly lower than in the age- and sex-matched general population. In the ICU patient population, the maximum plasma DNA concentration measured during the first 96 hours in intensive care correlated significantly with disease severity and degree of organ failure and was independently associated with hospital mortality. In patients with severe sepsis or septic shock, cell-free plasma DNA concentrations were significantly higher in ICU and hospital nonsurvivors than in survivors and showed moderate discriminative power regarding ICU mortality. Plasma DNA was an independent predictor of ICU mortality, but not of hospital mortality. The degree of organ dysfunction correlated independently with plasma DNA concentration in severe sepsis and with plasma HO-1 concentration in ICU patients. The HO-1 -413T/GT(L)/+99C haplotype was associated with HO-1 plasma levels and with the frequency of multiple organ dysfunction. Plasma DNA and HO-1 concentrations may support the assessment of outcome or organ failure development in critically ill patients, although their value is limited and requires further evaluation.
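The "discriminative power" of a biomarker such as cell-free plasma DNA for ICU mortality is usually quantified as the area under the ROC curve (AUC). The abstract does not report the statistic or its value, so the sketch below is an assumption for illustration only, using made-up concentrations.

```python
# Hedged sketch: AUC as a measure of a biomarker's discriminative power,
# computed via the Mann-Whitney formulation of the ROC area.
# The plasma DNA values below are illustrative, not study data.

def auc(marker_survivors, marker_nonsurvivors):
    """Probability that a randomly chosen nonsurvivor has a higher marker
    value than a randomly chosen survivor (ties count as 0.5)."""
    pairs = 0.0
    for s in marker_survivors:
        for ns in marker_nonsurvivors:
            if ns > s:
                pairs += 1.0
            elif ns == s:
                pairs += 0.5
    return pairs / (len(marker_survivors) * len(marker_nonsurvivors))

# Illustrative plasma DNA concentrations (arbitrary units).
survivors    = [1.2, 1.5, 2.0, 2.2, 2.8, 3.0]
nonsurvivors = [1.8, 2.5, 3.1, 3.6, 4.0]

print(f"AUC: {auc(survivors, nonsurvivors):.2f}")  # prints 0.80; roughly 0.7-0.8 is often called moderate
```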
Abstract:
Essential thrombocythaemia (ET) is a myeloproliferative disease (MPD) characterized by thrombocytosis, i.e. a constant elevation of the platelet count. Thrombocytosis may appear in MPDs (ET, polycythaemia vera, chronic myeloid leukaemia, myelofibrosis) and as a reactive phenomenon. The differential diagnosis of thrombocytosis is important, because the clinical course, need for therapy, and prognosis differ between patients with MPDs and those with reactive thrombocytosis. ET patients may remain asymptomatic for years, but serious thrombohaemorrhagic and pregnancy-related complications may occur. The complications are difficult to predict. The aims of the present study were to evaluate the diagnostic findings, clinical course, and prognostic factors of ET. The present retrospective study consists of 170 ET patients. Two thirds had a platelet count < 1000 × 10⁹/l. The diagnosis was supported by an increased number of megakaryocytes with abnormal morphology in a bone marrow aspirate, aggregation defects in platelet function studies, and the presence of spontaneous erythroid and/or megakaryocytic colony formation in in vitro cultures of haematopoietic progenitors. About 70% of the patients had spontaneous colony formation, while about 30% had a normal growth pattern. Only a fifth of the patients remained asymptomatic. Half had a major thrombohaemorrhagic complication. The proportion of patients suffering from thrombosis was as high as 45%. About a fifth had major bleedings. Half of the patients had microvascular symptoms. Age over 60 years increased the risk of major bleedings, but the occurrence of thrombotic complications was similar in all age groups. Male gender, smoking in female patients, the presence of any spontaneous colony formation, and the presence of spontaneous megakaryocytic colony formation in younger patients were identified as risk factors for thrombosis. Pregnant ET patients had an increased risk of complications. Forty-five per cent of the pregnancies were complicated, and 38% of them ended in stillbirth. Treatment with acetylsalicylic acid alone or in combination with platelet-lowering drugs improved the outcome of the pregnancy. The present findings on risk factors in ET, as well as the treatment outcomes in the pregnancies of ET patients, should be taken into account when planning treatment strategies for Finnish patients.
Abstract:
The VuoKKo trial consisted of 236 women referred and randomised due to menorrhagia in the five university hospitals of Finland between November 1994 and November 1997. Of these women, 117 were randomised to hysterectomy and 119 to use of the levonorgestrel-releasing intrauterine system (LNG-IUS) to treat this complaint. Their follow-up visits took place six and twelve months after the treatment and five years after the randomisation. The primary trial focused on quality-of-life and monetary aspects; the aim of the present study was to compare ovarian function, bone mineral density (BMD), and sexual functioning between these two treatment options. Ovarian function seemed to decrease after hysterectomy, as demonstrated by increased hot flashes and serum follicle-stimulating hormone concentrations twelve months after the operation. Such an increase was not seen among LNG-IUS users. The pulsatility index of intraovarian arteries measured by two-dimensional ultrasound decreased in the hysterectomy group, but not in the LNG-IUS group. The decrease in serum inhibin B concentrations was similar in both groups, while ovarian artery circulation remained unchanged. BMD measured by dual x-ray absorptiometry (DXA) at the lumbar spine and femoral neck at baseline and at five years after treatment showed a decrease at the lumbar spine among hysterectomised women, but not among LNG-IUS users. In both groups, BMD at the femoral neck had decreased. The differences between the groups were not, however, significant. Sexual functioning assessed by McCoy's sexual scale showed that sexual satisfaction as well as intercourse frequency had increased and sexual problems had decreased among hysterectomised women six months after treatment. Among LNG-IUS users, sexual satisfaction and sexual problems remained unchanged. Although the two groups did not differ in terms of sexual satisfaction or sexual problems at the one-year and five-year follow-ups, LNG-IUS users were less satisfied with their partners than hysterectomised women.
Abstract:
Exposure to water-damaged buildings and the associated health problems have evoked concern and created confusion during the past 20 years. Individuals exposed to moisture-problem buildings report adverse health effects such as non-specific respiratory symptoms. Microbes, especially fungi, growing on damp material have been considered potential sources of the health problems encountered in these buildings. Fungi and their airborne spores contain allergens and secondary metabolites which may trigger allergic as well as inflammatory types of responses in the eyes and airways. Although epidemiological studies have revealed an association between damp buildings and health problems, no direct cause-and-effect relationship has been established. Further knowledge is needed about the epidemiology and the mechanisms leading to the symptoms associated with exposure to fungi. Two different approaches were used in this thesis to investigate the diverse health effects associated with exposure to moulds. In the first part, sensitization to moulds was evaluated and potential cross-reactivity studied in patients attending a hospital for suspected allergy. In the second part, one typical mould known to be found in water-damaged buildings and to produce toxic secondary metabolites was used to study airway responses in an experimental model. Exposure studies were performed on both naive and allergen-sensitized mice. The first part of the study showed that mould allergy is rare and highly dependent on the atopic status of the examined individual. The prevalence of sensitization was 2.7% to Cladosporium herbarum and 2.8% to Alternaria alternata in patients, the majority of whom were atopic subjects. Some of the patients sensitized to mould suffered from atopic eczema. The patients were frequently observed to possess specific serum IgE antibodies to Pityrosporum ovale, a yeast present in the normal skin flora. In some of these patients, the IgE binding was found to be partly due to binding to shared glycoproteins in the mould and yeast allergen extracts. The second part of the study revealed that exposure to Stachybotrys chartarum spores induced airway inflammation in the lungs of mice. The inflammation was characterized by an influx of inflammatory cells, mainly neutrophils and lymphocytes, into the lungs, but almost no differences in airway responses were seen between the satratoxin-producing and non-satratoxin-producing strains. On the other hand, when mice were exposed to S. chartarum and sensitized/challenged with ovalbumin, the extent of the inflammation was markedly enhanced. A synergistic increase in the numbers of inflammatory cells was seen in bronchoalveolar lavage (BAL) fluid, and severe inflammation was observed in the histological lung sections. In conclusion, the results of this thesis imply that exposure to moulds in water-damaged buildings may trigger health effects in susceptible individuals. The symptoms can rarely be explained by IgE-mediated allergy to moulds; other non-allergic mechanisms seem to be involved. Stachybotrys chartarum is one of the moulds potentially responsible for health problems. In this thesis, new reaction models for the airway inflammation induced by S. chartarum were established using experimental approaches. The immunological status played an important role in the airway inflammation, enhancing the effects of mould exposure. The results imply that sensitized individuals may be more susceptible to mould exposure than non-sensitized individuals.
Abstract:
Background. Cardiovascular disease (CVD) remains the most serious threat to life and health in industrialized countries. Atherosclerosis is the main underlying pathology associated with CVD, in particular coronary artery disease (CAD), ischaemic stroke, and peripheral arterial disease. Risk factors play an important role in initiating and accelerating the complex process of atherosclerosis. Most studies of risk factors have focused on the presence or absence of clinically defined CVD. Less is known about the determinants of the severity and extent of atherosclerosis in symptomatic patients. Aims. To clarify the association between coronary and carotid artery atherosclerosis, and to study the determinants associated with these abnormalities with special regard to novel cardiovascular risk factors. Subjects and methods. Quantitative coronary angiography (QCA) and B-mode ultrasound were used to assess coronary and carotid artery atherosclerosis in 108 patients with clinically suspected CAD referred for elective coronary angiography. To evaluate the anatomic severity and extent of CAD, several QCA parameters were incorporated into indexes. These measurements reflected CAD severity, extent, and overall atheroma burden and were calculated for the entire coronary tree and separately for different coronary segments (i.e., left main, proximal, mid, and distal segments). Maximum and mean intima-media thickness (IMT) values of the carotid arteries were measured and expressed as mean aggregate values. Furthermore, the study design included extensive fasting blood samples, an oral glucose tolerance test, and an oral fat-load test performed in each participant. Results. Maximum and mean IMT values were significantly correlated with CAD severity, extent, and atheroma burden. There was heterogeneity in the associations between IMT and CAD indexes according to the anatomical location of CAD. Maximum and mean IMT values were correlated with QCA indexes for the mid and distal segments, but not for the proximal segments, of the coronary vessels. Paraoxonase-1 (PON1) activity and concentration were lower in subjects with significant CAD, and there was a significant relationship between PON1 activity and concentration and coronary atherosclerosis assessed by QCA. PON1 activity was a significant determinant of the severity of CAD independently of HDL cholesterol. Neither PON1 activity nor concentration was associated with carotid IMT. The concentrations of triglycerides (TGs), triglyceride-rich lipoproteins (TRLs), oxidized LDL (oxLDL), and the cholesterol content of remnant lipoprotein particles (RLP-C) were significantly increased at 6 hours after intake of an oral fatty meal as compared with fasting values. The mean peak size of LDL remained unchanged 6 hours after the test meal. The correlations between total TGs, TRLs, and RLP-C in the fasting and postprandial states were highly significant. RLP-C correlated with oxLDL both in the fasting and in the fed state, and inversely with LDL size. In multivariate analysis, oxLDL was a determinant of the severity and extent of CAD. Neither total TGs, TRLs, oxLDL, nor LDL size was linked to carotid atherosclerosis. Insulin resistance (IR) was associated with an increased severity and extent of coronary atherosclerosis and seemed to be a stronger predictor of coronary atherosclerosis in the distal parts of the coronary tree than in the proximal and mid parts. In the multivariate analysis, IR was a significant predictor of the severity of CAD.
IR did not correlate with carotid IMT. Maximum and mean carotid IMT were higher in patients with the apoE4 phenotype than in subjects with the apoE3 phenotype. Likewise, patients with the apoE4 phenotype had more severe and extensive CAD than individuals with the apoE3 phenotype. Conclusions. 1) There is an association between carotid IMT and the severity and extent of CAD. Carotid IMT seems to be a weaker predictor of coronary atherosclerosis in the proximal parts of the coronary tree than in the mid and distal parts. 2) PON1 activity has an important role in the pathogenesis of coronary atherosclerosis. More importantly, the study illustrates how the protective role of HDL can be modulated by its components, such that equivalent serum concentrations of HDL cholesterol may not equate with an equivalent protective capacity. 3) RLP-C in the fasting state is a good marker of postprandial TRLs. Circulating oxLDL increases in CAD patients postprandially. The highly significant positive correlation between postprandial TRLs and postprandial oxLDL suggests that the postprandial state creates oxidative stress. Our findings emphasize the fundamental role of LDL oxidation in the development of atherosclerosis even after inclusion of conventional CAD risk factors. 4) Disturbances in glucose metabolism are crucial in the pathogenesis of coronary atherosclerosis. In fact, subjects with IR are comparable with diabetic subjects in terms of the severity and extent of CAD. 5) ApoE polymorphism is involved in susceptibility to both carotid and coronary atherosclerosis.
Abstract:
Visual acuities at the time of referral and on the day before surgery were compared in 124 patients operated on for cataract in Vaasa Central Hospital, Finland. Preoperative visual acuity and the occurrence of ocular and general disease were compared in samples of consecutive cataract extractions performed in 1982, 1985, 1990, 1995 and 2000 in two hospitals in the Vaasa region in Finland. The repeatability and the standard deviation of random measurement error in visual acuity and refractive error determination in a clinical environment in cataractous, pseudophakic and healthy eyes were estimated by re-examining the visual acuity and refractive error of patients referred for cataract surgery or consultation by ophthalmic professionals. Altogether 99 eyes of 99 persons (41 cataractous, 36 pseudophakic and 22 healthy eyes) with a visual acuity range of Snellen 0.3 to 1.3 (0.52 to -0.11 logMAR) were examined. During an average waiting time of 13 months, visual acuity in the study eye deteriorated from 0.68 logMAR to 0.96 logMAR (from 0.2 to 0.1 in Snellen decimal values). The average decrease in vision was 0.27 logMAR per year. In the fastest quartile, the visual acuity change per year was 0.75 logMAR, and in the second fastest 0.29 logMAR; the third and fourth quartiles were virtually unaffected. From 1982 to 2000, the incidence of cataract surgery increased from 1.0 to 7.2 operations per 1000 inhabitants per year in the Vaasa region. The average preoperative visual acuity in the operated eye improved by 0.85 logMAR (in decimal values, from 0.03 to 0.2) and in the better eye by 0.27 logMAR (in decimal values, from 0.23 to 0.43) over this period. The proportion of patients profoundly visually handicapped (VA in the better eye <0.1) before the operation fell from 15% to 4%, and that of patients less profoundly visually handicapped (VA in the better eye 0.1 to <0.3) from 47% to 15%. The repeatability of visual acuity measurement, estimated as a coefficient of repeatability for all 99 eyes, was ±0.18 logMAR, and the standard deviation of measurement error was 0.06 logMAR. Eyes with the lowest visual acuity (0.3-0.45) had the largest variability, with a coefficient of repeatability of ±0.24 logMAR, and eyes with a visual acuity of 0.7 or better had the smallest, ±0.12 logMAR. The repeatability of refractive error measurement was studied in the same patient material as the repeatability of visual acuity. Differences between measurements 1 and 2 were calculated as three-dimensional vector values and spherical equivalents and expressed as coefficients of repeatability. The coefficients of repeatability for all eyes for the vertical, torsional and horizontal vectors were ±0.74 D, ±0.34 D and ±0.93 D, respectively, and the coefficient for the spherical equivalent for all eyes was ±0.74 D. Eyes with lower visual acuity (0.3-0.45) had larger variability in vector and spherical equivalent values (±1.14 D), but the difference between visual acuity groups was not statistically significant. The difference in the mean defocus equivalent between measurements 1 and 2 was, however, significantly greater in the lower visual acuity group. If a change of ±0.5 D (measured in defocus equivalents) is accepted as a basis for a change of spectacles for eyes with good vision, the corresponding basis for eyes in the visual acuity range of 0.3-0.65 would be ±1 D. Differences in repeated visual acuity measurements are partly explained by errors in refractive error measurements.
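Two quantities recur throughout this summary: the conversion between Snellen decimal acuity and logMAR (logMAR = -log10(decimal acuity)), and the coefficient of repeatability. The sketch below illustrates both; the coefficient of repeatability is assumed to take the usual Bland-Altman form (1.96 × the standard deviation of paired differences), which the abstract does not state explicitly, and all measurement values in the example are illustrative rather than study data.

```python
import math

# Hedged sketch of the two quantities used above. The logMAR conversion is the
# standard definition; the coefficient of repeatability is assumed to be the
# usual Bland-Altman form (1.96 x SD of paired differences). Data are illustrative.

def to_logmar(decimal_acuity):
    """Convert Snellen decimal acuity to logMAR: logMAR = -log10(decimal)."""
    return -math.log10(decimal_acuity)

def coefficient_of_repeatability(first, second):
    """1.96 x standard deviation of the differences between repeated measurements."""
    diffs = [a - b for a, b in zip(first, second)]
    mean = sum(diffs) / len(diffs)
    sd = (sum((d - mean) ** 2 for d in diffs) / (len(diffs) - 1)) ** 0.5
    return 1.96 * sd

print(f"0.2 decimal -> {to_logmar(0.2):.2f} logMAR")   # ~0.70
print(f"0.1 decimal -> {to_logmar(0.1):.2f} logMAR")   # 1.00

# Illustrative repeated logMAR measurements of the same eyes (visit 1 vs visit 2).
visit1 = [0.30, 0.10, 0.52, 0.22, 0.40]
visit2 = [0.22, 0.18, 0.44, 0.30, 0.34]
print(f"Coefficient of repeatability: ±{coefficient_of_repeatability(visit1, visit2):.2f} logMAR")
```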