Abstract:
Severe sepsis is common and is associated with high costs of care and significant mortality. Its incidence has been reported to vary between 0.5/1,000 and 3/1,000 in different studies. The worldwide Surviving Sepsis Campaign, with its guidelines and treatment protocols, aims at decreasing the high morbidity and mortality associated with severe sepsis. Various mediators of inflammation, such as high mobility group box-1 protein (HMGB1) and vascular endothelial growth factor (VEGF), have been studied as markers of severity of illness and outcome in severe sepsis. Long-term survival with quality of life (QOL) assessment is an important outcome after severe sepsis. The objective of this study was to evaluate the incidence, severity of organ dysfunction and outcome of severe sepsis in intensive care treated patients in Finland (study I). HMGB1 and VEGF were studied as predictors of severity of illness, development and type of organ dysfunction, and hospital mortality (studies II and III). The long-term outcome and quality of life were assessed, and quality-adjusted life years (QALYs) and the cost per QALY were estimated (study IV). A total of 470 patients with severe sepsis were included in the Finnsepsis Study. Patients were treated in 24 Finnish intensive care units during a 4-month period from 1 November 2004 to 28 February 2005. The incidence of severe sepsis was 0.38/1,000 in the adult population (95% confidence interval 0.34-0.41). Septic shock (77%), severe oxygenation impairment (71.4%) and acute renal failure (23.2%) were the most common organ failures. The ICU, hospital, one-year and two-year mortalities were 15.5%, 28.3%, 40.9% and 44.9%, respectively. HMGB1 and VEGF were elevated in patients with severe sepsis. VEGF concentrations were lower in non-survivors than in survivors, but HMGB1 levels did not differ between survivors and non-survivors. Neither HMGB1 nor VEGF was predictive of hospital mortality. QOL was measured a median of 17 months after severe sepsis and was lower than in the reference population. The mean QALY was 15.2 years for a surviving patient and the cost per QALY was 2,139. The study showed that the incidence of severe sepsis is lower in Finland than in other countries. The short-term outcome is comparable with that in other countries, but the long-term outcome is poor. HMGB1 and VEGF are not useful in predicting mortality in severe sepsis. The mean QALY for a surviving patient is 15.2 years and, as the cost per QALY is reasonably low, intensive care is cost-effective in patients with severe sepsis.
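As a rough illustration of how an incidence figure of this kind can be reproduced, the sketch below computes an annualized rate per 1,000 with a normal-approximation 95% confidence interval. The case count and study period are taken from the abstract, while the adult population size is a hypothetical placeholder, not a figure from the study.

    # Sketch: annualized incidence per 1,000 adults with a normal-approximation
    # 95% CI, in the spirit of the Finnsepsis figures quoted above.
    import math

    cases = 470                  # severe sepsis admissions in the study period
    period_years = 4 / 12        # 1 Nov 2004 - 28 Feb 2005
    adult_population = 3.75e6    # hypothetical adult population at risk (assumption)

    person_years = adult_population * period_years
    rate_per_1000 = 1000 * cases / person_years

    # normal approximation to the Poisson count for the 95% CI
    se = 1000 * math.sqrt(cases) / person_years
    ci = (rate_per_1000 - 1.96 * se, rate_per_1000 + 1.96 * se)

    print(f"incidence {rate_per_1000:.2f}/1,000/year, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")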
Abstract:
Several hypnosis monitoring systems based on the processed electroencephalogram (EEG) have been developed for use during general anesthesia. The assessment of the analgesic component (antinociception) of general anesthesia is an emerging field of research. This study investigated the interaction of hypnosis and antinociception, the association of several physiological variables with the degree of intraoperative nociception, and aspects of EEG Bispectral Index Scale (BIS) monitoring during general anesthesia. In addition, EEG features and heart rate (HR) responses during desflurane and sevoflurane anesthesia were compared. A propofol bolus of 0.7 mg/kg was more effective than an alfentanil bolus of 0.5 mg in preventing the recurrence of movement responses during uterine dilatation and curettage (D&C) after a propofol-alfentanil induction combined with nitrous oxide (N2O). HR and several HR variability-, frontal electromyography (fEMG)-, pulse plethysmography (PPG)-, and EEG-derived variables were associated with surgery-induced movement responses. Movers were discriminated from non-movers mostly by the post-stimulus values, either as such or normalized with respect to the pre-stimulus values. In logistic regression analysis, the best classification performance was achieved with the combination of normalized fEMG power and HR during D&C (overall accuracy 81%, sensitivity 53%, specificity 95%), and with the combination of normalized fEMG-related response entropy, electrocardiography (ECG) R-to-R interval (RRI), and PPG dicrotic notch amplitude during sevoflurane anesthesia (overall accuracy 96%, sensitivity 90%, specificity 100%). ECG electrode impedances after alcohol swab skin pretreatment alone were higher than the impedances of designated EEG electrodes. The BIS values registered with ECG electrodes were higher than those registered simultaneously with EEG electrodes. No significant difference in the time to home-readiness after isoflurane-N2O or sevoflurane-N2O anesthesia was found when the administration of the volatile agent was guided by BIS monitoring. All other early and intermediate recovery parameters were also similar. Transient epileptiform EEG activity was detected in eight of 15 sevoflurane patients during a rapid increase in the inspired volatile concentration, and in none of the 16 desflurane patients. The observed transient EEG changes did not adversely affect the recovery of the patients. Following the rapid increase in the inhaled desflurane concentration, HR increased transiently, reaching its maximum in two minutes. In the sevoflurane group, the increase was slower and more subtle. In conclusion, desflurane may be a safer volatile agent than sevoflurane in patients with a lowered seizure threshold. The tachycardia induced by a rapid increase in the inspired desflurane concentration may present a risk for patients with heart disease. Designated EEG electrodes may be superior to ECG electrodes in EEG BIS monitoring. When the administration of isoflurane or sevoflurane is adjusted to maintain BIS values at 50-60 in healthy ambulatory surgery patients, the speed and quality of recovery are similar after both isoflurane-N2O and sevoflurane-N2O anesthesia. When anesthesia is maintained by the inhalation of N2O and bolus doses of propofol and alfentanil in healthy unparalyzed patients, movement responses may be best avoided by ensuring a relatively deep hypnotic level with propofol.
HR/RRI, fEMG, and PPG dicrotic notch amplitude are potential indicators of nociception during anesthesia, but their performance needs to be validated in future studies. Combining information from different sources may improve the discrimination of the level of nociception.
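A minimal sketch of the kind of logistic-regression classifier described above, combining two intraoperative variables (labelled here as normalized fEMG power and HR) and reporting overall accuracy, sensitivity and specificity. The data are synthetic placeholders, not recordings from these studies.

    # Sketch: combining two variables in a logistic regression to discriminate
    # movers from non-movers, with the usual classification summaries.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import confusion_matrix

    rng = np.random.default_rng(0)
    n = 200
    femg_norm = np.concatenate([rng.normal(1.0, 0.3, n), rng.normal(1.8, 0.4, n)])
    hr = np.concatenate([rng.normal(65, 8, n), rng.normal(78, 10, n)])
    moved = np.concatenate([np.zeros(n), np.ones(n)])   # 1 = movement response

    X = np.column_stack([femg_norm, hr])
    model = LogisticRegression().fit(X, moved)
    pred = model.predict(X)

    tn, fp, fn, tp = confusion_matrix(moved, pred).ravel()
    print("accuracy   ", (tp + tn) / len(moved))
    print("sensitivity", tp / (tp + fn))   # movers correctly flagged
    print("specificity", tn / (tn + fp))   # non-movers correctly cleared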
Abstract:
Visual acuities at the time of referral and on the day before surgery were compared in 124 patients operated on for cataract in Vaasa Central Hospital, Finland. Preoperative visual acuity and the occurrence of ocular and general disease were compared in samples of consecutive cataract extractions performed in 1982, 1985, 1990, 1995 and 2000 in two hospitals in the Vaasa region in Finland. The repeatability and standard deviation of random measurement error in visual acuity and refractive error determination in a clinical environment in cataractous, pseudophakic and healthy eyes were estimated by re-examining the visual acuity and refractive error of patients referred for cataract surgery or consultation by ophthalmic professionals. Altogether 99 eyes of 99 persons (41 cataractous, 36 pseudophakic and 22 healthy eyes) with a visual acuity range of Snellen 0.3 to 1.3 (0.52 to -0.11 logMAR) were examined. During an average waiting time of 13 months, visual acuity in the study eye decreased from 0.68 logMAR to 0.96 logMAR (from 0.2 to 0.1 in Snellen decimal values). The average decrease in vision was 0.27 logMAR per year. In the fastest quartile, the visual acuity change per year was 0.75 logMAR and in the second fastest 0.29 logMAR; the third and fourth quartiles were virtually unaffected. From 1982 to 2000, the incidence of cataract surgery increased from 1.0 to 7.2 operations per 1,000 inhabitants per year in the Vaasa region. The average preoperative visual acuity in the operated eye increased by 0.85 logMAR (in decimal values from 0.03 to 0.2) and in the better eye by 0.27 logMAR (in decimal values from 0.23 to 0.43) over this period. The proportion of patients profoundly visually handicapped (VA in the better eye <0.1) before the operation fell from 15% to 4%, and that of patients less profoundly visually handicapped (VA in the better eye 0.1 to <0.3) from 47% to 15%. The repeatability of visual acuity measurement, estimated as a coefficient of repeatability for all 99 eyes, was ±0.18 logMAR, and the standard deviation of measurement error was 0.06 logMAR. Eyes with the lowest visual acuity (0.3-0.45) had the largest variability, with a coefficient of repeatability of ±0.24 logMAR, and eyes with a visual acuity of 0.7 or better had the smallest, ±0.12 logMAR. The repeatability of refractive error measurement was studied in the same patient material as the repeatability of visual acuity. Differences between measurements 1 and 2 were calculated as three-dimensional vector values and spherical equivalents and expressed as coefficients of repeatability. Coefficients of repeatability for all eyes for the vertical, torsional and horizontal vectors were ±0.74 D, ±0.34 D and ±0.93 D, respectively, and for the spherical equivalent for all eyes ±0.74 D. Eyes with lower visual acuity (0.3-0.45) had larger variability in vector and spherical equivalent values (±1.14 D), but the difference between visual acuity groups was not statistically significant. The difference in the mean defocus equivalent between measurements 1 and 2 was, however, significantly greater in the lower visual acuity group. If a change of ±0.5 D (measured in defocus equivalents) is accepted as a basis for a change of spectacles for eyes with good vision, the corresponding basis for eyes in the visual acuity range of 0.3-0.65 would be ±1 D. Differences in repeated visual acuity measurements are partly explained by errors in refractive error measurements.
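For readers unfamiliar with the conversions used above, the sketch below shows the Snellen-decimal-to-logMAR transformation and one common (Bland-Altman-style) way to estimate a coefficient of repeatability and a single-measurement error SD from test-retest data. The repeat measurements are invented for illustration, and the exact repeatability definition used in the thesis may differ.

    # Sketch: decimal VA -> logMAR, plus repeatability statistics from repeats.
    import numpy as np

    def to_logmar(decimal_va):
        """logMAR = -log10(decimal visual acuity); e.g. 0.2 -> 0.70 logMAR."""
        return -np.log10(decimal_va)

    va1 = np.array([0.3, 0.5, 0.8, 1.0, 0.4])   # first measurement (decimal), toy data
    va2 = np.array([0.4, 0.5, 0.7, 1.0, 0.3])   # repeat measurement (decimal), toy data

    diff = to_logmar(va1) - to_logmar(va2)
    coeff_repeatability = 1.96 * np.std(diff, ddof=1)        # +/- limits of agreement
    measurement_error_sd = np.std(diff, ddof=1) / np.sqrt(2)  # SD of a single test

    print(f"CoR +/-{coeff_repeatability:.2f} logMAR, "
          f"single-test SD {measurement_error_sd:.2f} logMAR")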
Abstract:
The prevalence of variegate porphyria (VP) (2.1:100 000; in 2006 n=108) was higher in Finland than in other European countries due to a founder effect (R152C). The incidence of VP was estimated at 0.2:1 000 000 based on the number of new symptomatic patients yearly. The prevalence of porphyria cutanea tarda (PCT) was 1.2:100 000 (in 2006 n=63), which is only one fourth of the numbers reported from other European countries. The estimated incidence of PCT was 0.5:1 000 000. Based on measurements of uroporphyrinogen decarboxylase activity in erythrocytes, the proportion of familial PCT was 49% of the cases. The prevalence of erythropoietic protoporphyria (EPP) was 0.8:100 000 (in 2006 n=39), including asymptomatic carriers of a mutation in the ferrochelatase (FECH) gene. The incidence of EPP was estimated at 0.1:1 000 000. After 1980 the penetrance was 37% among patients with VP. Of the mutation carriers (n=57), 30% manifested with skin symptoms. The frequency of skin symptoms as the only clinical sign was stable before and after 1980 (22% vs. 21%), but acute attacks became infrequent (29% vs. 7%). Of the symptomatic patients, 30% had both acute attacks and skin symptoms and 80% had skin symptoms. Fragility (95%) and blistering (46%) of the skin on the backs of the hands were the most common skin symptoms. Transient correction of porphyrin metabolism using eight haem arginate infusions within five weeks had no effect on the skin symptoms in three of four patients with VP; in one case the skin symptoms disappeared transiently. One patient with homozygous VP had had severe photosensitivity since birth. Sensory polyneuropathy, glaucoma and renal failure developed during the 25-year follow-up without the presence of acute attacks. The I12T mutation was detected in both of his alleles of the protoporphyrinogen oxidase gene. The lack of skin symptoms and the infrequency of acute attacks (1/9) in heterozygous carriers of the I12T mutation indicate a mild phenotype (penetrance 11%). Four mutations (751delGAGAA, 1122delT, C286T, C343T) in the FECH gene were characterised in four of 15 families with EPP. Burning pain (96%) and swelling (92%) of the sun-exposed skin were the major skin symptoms. Hepatopathy appeared in one of 25 symptomatic patients (4%). Clinical manifestations and associated factors of PCT were similar in the sporadic and familial types of PCT. The majority of the patients with PCT had one to three precipitating factors: alcohol intake (78%), mutations in the hemochromatosis-associated gene (50%), use of oestrogen (25% of women) and hepatitis B or C infection (25%). Fatty liver disease (67%) and siderosis (67%) were commonly found in their liver biopsies. The major histopathological change in the sun-exposed skin of the patients with VP (n=20), EPP (n=8) and PCT (n=5) was thickening of the vessel walls of the upper dermis, suggesting that the vessel wall is the primary site of the phototoxic reaction in each type of porphyria. The fine structure of the vessel walls was similar in VP, EPP and PCT, consisting of a multilayered basement membrane with an excess of finely granular substance between the layers, surrounded by a band of homogeneous material. EPP was characterised by amorphous perivascular deposits extending also into the extravascular space. In direct immunofluorescence studies, homogeneous IgG deposits in the vessel walls of the upper dermis of the sun-exposed skin were demonstrated in each type of porphyria.
In EPP, the excess material around the vessel walls consisted of other proteins, such as serum amyloid protein and kappa and lambda light chains, in addition to basement membrane constituents such as collagen IV and laminin. These results suggest that the alterations of the vessel walls are a consequence of repeated damage and repair in the vessel wall. The microscopic alterations could be demonstrated even in the normal-looking but sun-exposed skin of the patients with EPP during the symptom-free phase, suggesting that the vascular changes can be chronic. The stability of the vascular changes in the patients with PCT after treatment indicates that circulating porphyrins are not important for the maintenance of the changes.
Abstract:
The purpose of this study was to estimate the prevalence and distribution of reduced visual acuity, major chronic eye diseases, and the subsequent need for eye care services in the Finnish adult population comprising persons aged 30 years and older. In addition, we analyzed the effect of decreased vision on functioning and the need for assistance using the World Health Organization's (WHO) International Classification of Functioning, Disability, and Health (ICF) as a framework. The study was based on the Health 2000 health examination survey, a nationally representative population-based comprehensive survey of health and functional capacity carried out in 2000–2001 in Finland. The study sample representing the Finnish population aged 30 years and older was drawn by two-stage stratified cluster sampling. The Health 2000 survey included a home interview and a comprehensive health examination conducted at a nearby screening center. If the invited participants did not attend, an abridged examination was conducted at home or in an institution. Based on our findings in participants, the great majority (96%) of Finnish adults had at least moderate visual acuity (VA ≥ 0.5) with current refraction correction, if any. However, in the age group 75–84 years the prevalence decreased to 81%, and after 85 years to 46%. In the population aged 30 years and older, the prevalence of habitual visual impairment (VA ≤ 0.25) was 1.6%, and 0.5% were blind (VA < 0.1). The prevalence of visual impairment increased significantly with age (p < 0.001), and after the age of 65 years the increase was sharp. Visual impairment was equally common in both sexes (OR 1.20, 95% CI 0.82–1.74). Based on self-reported and/or register-based data, the estimated total prevalences of cataract, glaucoma, age-related maculopathy (ARM), and diabetic retinopathy (DR) in the study population were 10%, 5%, 4%, and 1%, respectively. The prevalence of all of these chronic eye diseases increased with age (p < 0.001). Cataract and glaucoma were more common in women than in men (OR 1.55, 95% CI 1.26–1.91 and OR 1.57, 95% CI 1.24–1.98, respectively). The most prevalent eye diseases in people with visual impairment (VA ≤ 0.25) were ARM (37%), unoperated cataract (27%), glaucoma (22%), and DR (7%). Over one-half (58%) of visually impaired people had had a vision examination during the past five years, and 79% had received some vision rehabilitation services, mainly in the form of spectacles (70%). Only one-third (31%) had received formal low vision rehabilitation (i.e., fitting of low vision aids, patient education, training for orientation and mobility, training for activities of daily living (ADL), or consultation with a social worker). People with low vision (VA 0.1–0.25) were less likely to have received formal low vision rehabilitation, magnifying glasses, or other low vision aids than blind people (VA < 0.1). Furthermore, low cognitive capacity and living in an institution were associated with limited use of vision rehabilitation services. Of the visually impaired living in the community, 71% reported a need for assistance and 24% had an unmet need for assistance in everyday activities. The prevalence of limitations in ADL, instrumental activities of daily living (IADL), and mobility increased with decreasing VA (p < 0.001).
Visually impaired persons (VA ≤ 0.25) were four times more likely to have ADL disabilities than those with good VA (VA ≥ 0.8) after adjustment for sociodemographic and behavioral factors and chronic conditions (OR 4.36, 95% CI 2.44 – 7.78). Limitations in IADL and measured mobility were five times as likely (OR 4.82, 95% CI 2.38 – 9.76 and OR 5.37, 95% CI 2.44 – 7.78, respectively) and self-reported mobility limitations were three times as likely (OR 3.07, 95% CI 1.67 – 9.63) as in persons with good VA. The high prevalence of age-related eye diseases and subsequent visual impairment in the fastest growing segment of the population will result in a substantial increase in the demand for eye care services in the future. Many of the visually impaired, especially older persons with decreased cognitive capacity or living in an institution, have not had a recent vision examination and lack adequate low vision rehabilitation. This highlights the need for regular evaluation of visual function in the elderly and an active dissemination of information about rehabilitation services. Decreased VA is strongly associated with functional limitations, and even a slight decrease in VA was found to be associated with limited functioning. Thus, continuous efforts are needed to identify and treat eye diseases to maintain patients’ quality of life and to alleviate the social and economic burden of serious eye diseases.
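As an illustration of the odds-ratio summaries quoted above, the sketch below computes an unadjusted odds ratio with a Woolf (log-scale) 95% confidence interval from a 2x2 table. The adjusted ORs in the text come from multivariable logistic regression, and the counts below are hypothetical.

    # Sketch: odds ratio with a Woolf 95% CI from a 2x2 table (made-up counts).
    import math

    # rows: visually impaired / good VA; columns: ADL disability yes / no
    a, b = 30, 40     # impaired: with / without ADL disability (hypothetical)
    c, d = 80, 450    # good VA:  with / without ADL disability (hypothetical)

    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    print(f"OR {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")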
Abstract:
Soy-derived phytoestrogen genistein and 17β-estradiol (E2), the principal endogenous estrogen in women, are also potent antioxidants protecting LDL and HDL lipoproteins against oxidation. This protection is enhanced by esterification with fatty acids, resulting in lipophilic molecules that accumulate in lipoproteins or fatty tissues. The aims were to investigate whether genistein becomes esterified with fatty acids in human plasma, accumulating in lipoproteins, and to develop a method for the quantitation of these esters; to study the antioxidant activity of different natural and synthetic estrogens in LDL and HDL; and to determine the E2 esters in visceral and subcutaneous fat in late pregnancy and in pre- and postmenopause. Human plasma was incubated with [3H]genistein and its esters were analyzed from lipoprotein fractions. Time-resolved fluoroimmunoassay (TR-FIA) was used to quantitate genistein esters in monkey plasma after subcutaneous and oral administration. The E2 esters in women's serum and adipose tissue were also quantitated using TR-FIA. The antioxidant activity of estrogen derivatives (n=43) on LDL and HDL was assessed by monitoring the copper-induced formation of conjugated dienes. Human plasma was shown to produce lipoprotein-bound genistein fatty acid esters, providing a possible explanation for the previously reported increased oxidation resistance of LDL particles during intake of soybean phytoestrogens. Genistein esters were introduced into the blood by subcutaneous administration. The antioxidant effect of estrogens on lipoproteins is highly structure-dependent. LDL and HDL were protected against oxidation by many unesterified, yet lipophilic derivatives. The strongest antioxidants had an unsubstituted A-ring phenolic hydroxyl group with one or two adjacent methoxy groups. E2 ester levels were high during late pregnancy. The median concentration of E2 esters in pregnancy serum was 0.42 nmol/l (n=13) and in pre- (n=8) and postmenopause (n=6) 0.07 and 0.06 nmol/l, respectively. In pregnancy visceral fat the concentration of E2 esters was 4.24 nmol/l, and in pre- and postmenopause 0.82 and 0.74 nmol/l. The results from subcutaneous fat were similar. In serum and fat during pregnancy, E2 esters constituted about 0.5% and 10% of the free E2, respectively. In non-pregnant women most of the E2 in fat was esterified (ester/free ratio 150-490%). In postmenopause, E2 levels in fat highly exceeded those in serum, the majority being esterified. The pathways for fatty acid esterification of steroid hormones are found in organisms ranging from invertebrates to vertebrates. The evolutionary preservation and relative abundance of E2 esters, especially in fat tissue, suggest a biological function, most likely in providing a readily available source of E2. The body's own estrogen reservoir could be used as a source of E2 by pharmacologically regulating E2 esterification or hydrolysis.
Abstract:
The adequacy of anesthesia has been studied since the introduction of balanced general anesthesia. Commercial monitors based on electroencephalographic (EEG) signal analysis have been available for monitoring the hypnotic component of anesthesia since the beginning of the 1990s. Monitors measuring the depth of anesthesia assess the cortical function of the brain, and have gained acceptance during surgical anesthesia with most of the anesthetic agents used. However, due to frequent artifacts, they are considered unsuitable for monitoring consciousness in intensive care patients. The assessment of analgesia is one of the cornerstones of general anesthesia. Prolonged surgical stress may lead to increased morbidity and delayed postoperative recovery. However, no validated monitoring method is currently available for evaluating analgesia during general anesthesia. Awareness during anesthesia is caused by an inadequate level of hypnosis. This rare but severe complication of general anesthesia may lead to marked emotional stress and possibly posttraumatic stress disorder. In the present series of studies, the incidence of awareness and recall during outpatient anesthesia was evaluated and compared with that in inpatient anesthesia. A total of 1500 outpatients and 2343 inpatients underwent a structured interview. Clear intraoperative recollections were rare, the incidence being 0.07% in outpatients and 0.13% in inpatients. No significant differences emerged between outpatients and inpatients. However, significantly smaller doses of sevoflurane were administered to outpatients with awareness than to those without recollections (p<0.05). EEG artifacts in 16 brain-dead organ donors were evaluated during organ harvest surgery in a prospective, open, nonselective study. The source of the frontotemporal biosignals in brain-dead subjects was studied, and the resistance of the bispectral index (BIS) and Entropy to signal artifacts was compared. The hypothesis was that in brain-dead subjects, most of the biosignals recorded from the forehead would consist of artifacts. The original EEG was recorded, and State Entropy (SE), Response Entropy (RE), and BIS were calculated and monitored during solid organ harvest. SE differed from zero (inactive EEG) in 28%, RE in 29%, and BIS in 68% of the total recording time (p<0.0001 for all). The median values during the operation were SE 0.0, RE 0.0, and BIS 3.0. In four of the 16 organ donors, the EEG was not inactive, and unphysiologically distributed, nonreactive rhythmic theta activity was present in the original EEG signal. After the results from subjects with persistent residual EEG activity were excluded, SE, RE, and BIS differed from zero in 17%, 18%, and 62% of the recorded time, respectively (p<0.0001 for all). Due to various artifacts, the highest readings in all indices were recorded without neuromuscular blockade. The main sources of artifacts were electrocauterization, electromyography (EMG), 50-Hz artifact, handling of the donor, ballistocardiography, and electrocardiography. In a prospective, randomized study of 26 patients, the ability of the Surgical Stress Index (SSI) to differentiate between patients with two clinically different analgesic levels during shoulder surgery was evaluated. SSI values were lower in patients with an interscalene brachial plexus block than in patients without an additional plexus block. In all patients, anesthesia was maintained with desflurane, the concentration of which was targeted to maintain SE at 50.
Increased blood pressure or heart rate (HR), movement, and coughing were considered signs of intraoperative nociception and treated with alfentanil. Photoplethysmographic waveforms were collected from the arm contralateral to the operated side, and SSI was calculated offline. Two minutes after skin incision, SSI was not increased in the brachial plexus block group and was lower (38 ± 13) than in the control group (58 ± 13, p<0.005). Among the controls, one minute prior to alfentanil administration, the SSI value was higher than during periods of adequate antinociception, 59 ± 11 vs. 39 ± 12 (p<0.01). The total cumulative need for alfentanil was higher in the controls (2.7 ± 1.2 mg) than in the brachial plexus block group (1.6 ± 0.5 mg, p=0.008). Tetanic stimulation of the ulnar region of the hand increased SSI significantly only in patients with a brachial plexus block not covering the site of stimulation. The prognostic value of EEG-derived indices was evaluated and compared with transcranial Doppler ultrasonography (TCD), serum neuron-specific enolase (NSE) and S-100B after cardiac arrest. Thirty patients resuscitated from out-of-hospital cardiac arrest and treated with induced mild hypothermia for 24 h were included. The original EEG signal was recorded, and the burst suppression ratio (BSR), RE, SE, and wavelet subband entropy (WSE) were calculated. Neurological outcome during the six-month period after arrest was assessed with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). Twenty patients had a CPC of 1-2, one patient had a CPC of 3, and nine patients died (CPC 5). BSR, RE, and SE differed between the good (CPC 1-2) and poor (CPC 3-5) outcome groups (p=0.011, p=0.011, p=0.008, respectively) during the first 24 h after arrest. WSE was borderline higher in the good outcome group between 24 and 48 h after arrest (p=0.050). All patients with status epilepticus died, and their WSE values were lower (p=0.022). S-100B was lower in the good outcome group upon arrival at the intensive care unit (p=0.010). After hypothermia treatment, NSE and S-100B values were lower (p=0.002 for both) in the good outcome group. The pulsatility index was also lower in the good outcome group (p=0.004). In conclusion, the incidence of awareness in outpatient anesthesia did not differ from that in inpatient anesthesia. Outpatients are not at increased risk of intraoperative awareness relative to inpatients undergoing general anesthesia. SE, RE, and BIS showed non-zero values that normally indicate cortical neuronal function, but in these subjects the values were mostly due to artifacts after the clinical diagnosis of brain death. Entropy was more resistant to artifacts than BIS. During general anesthesia and surgery, SSI values were lower in patients with an interscalene brachial plexus block covering the sites of nociceptive stimuli. In detecting nociceptive stimuli, SSI performed better than HR, blood pressure, or RE. BSR, RE, and SE differed between the good and poor neurological outcome groups during the first 24 h after cardiac arrest, and they may aid in differentiating patients with good neurological outcomes from those with poor outcomes after out-of-hospital cardiac arrest.
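As background to the burst suppression ratio (BSR) mentioned above, the sketch below shows one common way to compute it: the percentage of an EEG epoch spent in suppression, defined here as absolute amplitude below a threshold for at least 0.5 s. The threshold, epoch length and toy signal are illustrative assumptions; commercial monitors use their own proprietary criteria.

    # Sketch: burst suppression ratio as the percent of an epoch in suppression.
    import numpy as np

    def burst_suppression_ratio(eeg, fs, threshold_uv=5.0, min_suppr_s=0.5):
        suppressed = np.abs(eeg) < threshold_uv
        min_len = int(min_suppr_s * fs)
        bsr_mask = np.zeros_like(suppressed)
        start = None
        # keep only suppressed runs lasting at least min_suppr_s
        for i, s in enumerate(suppressed):
            if s and start is None:
                start = i
            elif not s and start is not None:
                if i - start >= min_len:
                    bsr_mask[start:i] = True
                start = None
        if start is not None and len(suppressed) - start >= min_len:
            bsr_mask[start:] = True
        return 100.0 * bsr_mask.mean()

    fs = 100                                   # Hz, assumed sampling rate
    t = np.arange(0, 60, 1 / fs)               # one 60-s epoch
    # toy signal: 6 s of low-amplitude "suppression" alternating with 4 s bursts
    eeg = np.where((t % 10) < 6, 1.0, 30.0) * np.sin(2 * np.pi * 10 * t)
    print(f"BSR {burst_suppression_ratio(eeg, fs):.0f}%")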
Abstract:
The purpose of this study was to evaluate the use of sentinel node biopsy (SNB) in axillary nodal staging in breast cancer. A special interest was in sentinel node (SN) visualization, intraoperative detection of SN metastases, the feasibility of SNB in patients with pure tubular carcinoma (PTC) and in those with ductal carcinoma in situ (DCIS) in core needle biopsy (CNB), and additionally in the detection of axillary recurrences after tumour-negative SNB. Patients and methods. The study included 1580 patients with clinically node-negative stage T1-T2 breast cancer who underwent lymphoscintigraphy (LS), SNB and breast surgery between June 2000 and 2004 at the Breast Surgery Unit. The CNB samples were obtained from women who participated in the biennial, population-based mammography screening at the Mammography Screening Centre of Helsinki in 2001-2004. In the follow-up, a cohort of 205 patients who avoided AC due to negative SNB findings was evaluated using ultrasonography one and three years after breast surgery. Results. The visualization rate of axillary SNs was not enhanced by adjusting radioisotope doses according to BMI. The sensitivity of the intraoperative diagnosis of SN metastases of invasive lobular carcinoma (ILC) was higher, 87%, in the rapid intraoperative immunohistochemistry (IHC) group compared with 66% without it. The prevalence of tumour-positive SN findings was 27% in the 33 patients with breast tumours diagnosed as PTC. The median histological tumour size was similar in patients with or without axillary metastases. After the histopathological review, six out of 27 patients with true PTC had axillary metastases, with no significant change in the risk factors for axillary metastases. Of the 67 patients with DCIS in the preoperative percutaneous biopsy specimen, 30% had invasion in the surgical specimen. The strongest predictive factor for invasion was the visibility of the lesion on ultrasound. In the three-year follow-up, axillary recurrence was found in only two (0.5%) of the total of 383 ultrasound examinations performed during the study, and only one of the 369 examinations revealed cancer. None of the ultrasound examinations was false positive, and no study participant was subjected to unnecessary surgery due to ultrasound monitoring. Conclusions. Adjusting the dose of the radioactive tracer according to patient BMI does not increase the visualization rate of SNs. The intraoperative diagnosis of SN metastases is enhanced by rapid IHC, particularly in patients with ILC. SNB seems to be a feasible method for axillary staging of pure tubular carcinoma in patients with a low prevalence of axillary metastases. SNB also appears to be a sensible method in patients undergoing mastectomy due to DCIS in CNB. It also seems useful in patients with lesions visible on breast US. During follow-up, routine monitoring of the ipsilateral axilla using US is not worthwhile among breast cancer patients who avoided AC due to negative SN findings.
Abstract:
Head and neck squamous cell cancer (HNSCC) is the sixth most common cancer worldwide. Despite advances in combined modality therapy (surgery, radiotherapy, chemotherapy), the 5-year survival rate in stage III and IV disease remains at 40%-60%. Short-range Auger-electron emitters, such as In-111 and In-114m, tagged to a drug, molecule, peptide, protein or nanoparticle and brought into close proximity to nuclear DNA represent a fascinating alternative for treating cancer. In this thesis, we studied the usefulness of the Indium-111-bleomycin complex (In-111-BLMC) in the diagnostics and potential therapy of HNSCC using in vitro HNSCC cell lines, in vivo nude mice, and in vivo HNSCC patients. In in vitro experiments with HNSCC cell lines, the sensitivity to external beam radiation, BLM, In-111-BLMC, and In-111-Cl3 was studied using the 96-well plate clonogenic assay. The influence of BLM and In-111-BLMC on the cell cycle was measured with flow cytometry. In in vivo nude mice xenograft studies, the activity ratios of In-111-BLMC were obtained from gamma camera images. The effect of In-111-BLMC on HNSCC xenografts was studied. In in vivo patient studies, we determined the tumor uptake of In-111-BLMC with a gamma camera and the radioactivity from tumor samples using In-111-BLMC with a specific activity of 75, 175, or 375 MBq/mg BLM. The S values, i.e. the absorbed dose in a target organ per unit cumulated activity in a source organ, were simulated for In-111 and In-114m. In vitro studies showed variation in sensitivity to external beam radiation, BLM, and In-111-BLMC between HNSCC cell lines. IC50 values for BLM were 1.6-, 1.8-, and 2.1-fold higher than those for In-111-BLMC (40 MBq/mg BLM) in three HNSCC cell lines. A specific In-111 activity of 40 MBq/mg BLM was more effective in killing cells than a specific In-111 activity of 195 MBq/mg BLM (p=0.0023). In-111-Cl3 alone had no killing effect. The percentage of cells in the G2/M phase increased after exposure to BLM and especially to In-111-BLMC in the three cell lines studied, indicating a G2/M block. Tumor-seeking behavior was shown in the in vivo imaging study of xenografted mice. BLM and In-111-BLMC were more effective than NaCl in reducing xenografted tumor size in HNSCC. The uptake ratios obtained from gamma images in the in vivo patient study varied from 1.2 to 2.8 in malignant tumors. However, the uptake of In-111-BLMC was unaffected by increasing the injected activity. A positive correlation existed between In-111-BLMC uptake, Ki-67/MIB activity, and the number of mitoses. Regarding the S values, In-114m delivered a 4-fold higher absorbed radiation dose to the tumor compared with In-111, and thus In-114m-BLMC might be more effective than In-111-BLMC at the DNA level. Auger-electron emitters, such as In-111 and In-114m, might have potential in the treatment of HNSCC. Further studies are needed to develop a radiopharmaceutical agent with appropriate physical properties of the radionuclide and a suitable carrier to bring it to the targeted tissue.
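For reference, the S values quoted above enter the standard MIRD dose relation, sketched here: the mean absorbed dose to a target region is the cumulated activity in each source region multiplied by the corresponding S value and summed over source regions.

    % Standard MIRD relation behind the quoted S values.
    \[
      \bar{D}(r_T) \;=\; \sum_{r_S} \tilde{A}(r_S)\, S(r_T \leftarrow r_S),
      \qquad
      \tilde{A}(r_S) \;=\; \int_0^{\infty} A(r_S, t)\,\mathrm{d}t .
    \]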
Abstract:
Background. Hyperlipidemia is a common concern in patients with heterozygous familial hypercholesterolemia (HeFH) and in cardiac transplant recipients. In both groups, an elevated serum LDL cholesterol level accelerates the development of atherosclerotic vascular disease and increases the rates of cardiovascular morbidity and mortality. The purpose of this study was to assess the pharmacokinetics, efficacy, and safety of cholesterol-lowering pravastatin in children with HeFH and in pediatric cardiac transplant recipients receiving immunosuppressive medication. Patients and Methods. The pharmacokinetics of pravastatin was studied in 20 HeFH children and in 19 pediatric cardiac transplant recipients receiving triple immunosuppression. The patients ingested a single 10-mg dose of pravastatin, and plasma pravastatin concentrations were measured up to 10/24 hours. The efficacy and safety of pravastatin (maximum dose 10 to 60 mg/day in HeFH and 10 mg/day in transplant recipients) over one to two years were studied in 30 patients with HeFH and in 19 cardiac transplant recipients, respectively. In a subgroup of 16 HeFH children, serum non-cholesterol sterol ratios (10^2 x mmol/mol of cholesterol), surrogate estimates of cholesterol absorption (cholestanol, campesterol, sitosterol) and synthesis (desmosterol and lathosterol), were studied at study baseline (on plant stanol esters) and during the combination of pravastatin and plant stanol esters. In the transplant recipients, the lipoprotein levels and their mass compositions were analyzed before and after one year of pravastatin use, and then compared with values measured in 21 healthy pediatric controls. The transplant recipients were grouped into patients with transplant coronary artery disease (TxCAD) and patients without TxCAD, based on annual angiography evaluations before pravastatin. Results. In the cardiac transplant recipients, the mean area under the plasma concentration-time curve of pravastatin [AUC(0-10)], 264.1 ± 192.4 ng·h/mL, was nearly ten-fold higher than in the HeFH children (26.6 ± 17.0 ng·h/mL). By 2, 4, 6, 12 and 24 months of treatment, the LDL cholesterol levels in the HeFH children had decreased by 25%, 26%, 29%, 33%, and 32%, respectively. In the HeFH group, pravastatin treatment increased the markers of cholesterol absorption and decreased those of synthesis. High ratios of cholestanol to cholesterol were associated with poor cholesterol-lowering efficacy of pravastatin. In cardiac transplant recipients, pravastatin 10 mg/day lowered the LDL cholesterol by approximately 19%. Compared with the patients without TxCAD, patients with TxCAD had significantly lower HDL cholesterol concentrations and higher apoB-100/apoA-I ratios at baseline (1.0 ± 0.3 mmol/L vs. 1.4 ± 0.3 mmol/L, P = 0.031; and 0.7 ± 0.2 vs. 0.5 ± 0.1, P = 0.034) and after one year of pravastatin use (1.0 ± 0.3 mmol/L vs. 1.4 ± 0.3 mmol/L, P = 0.013; and 0.6 ± 0.2 vs. 0.4 ± 0.1, P = 0.005). Compared with healthy controls, the transplant recipients exhibited elevated serum triglycerides at baseline (median 1.3 [range 0.6-3.2] mmol/L vs. 0.7 [0.3-2.4] mmol/L, P=0.0002), which correlated negatively with their HDL cholesterol concentration (r = -0.523, P = 0.022). The recipients also exhibited higher apoB-100/apoA-I ratios (0.6 ± 0.2 vs. 0.4 ± 0.1, P = 0.005). In addition, elevated triglyceride levels were still observed after one year of pravastatin use (1.3 [0.5-3.5] mmol/L vs. 0.7 [0.3-2.4] mmol/L, P = 0.0004).
Clinically significant elevations in alanine aminotransferase, creatine kinase, or creatinine occurred in neither group. Conclusions. Immunosuppressive medication considerably increased the plasma pravastatin concentrations. In both patient groups, pravastatin treatment was moderately effective, safe, and well tolerated. In the HeFH group, high baseline cholesterol absorption seemed to predispose patients to insufficient cholesterol-lowering efficacy of pravastatin. In the cardiac transplant recipients, low HDL cholesterol and a high apoB-100/apoA-I ratio were associated with the development of TxCAD. Even though pravastatin in the transplant recipients effectively lowered serum total and LDL cholesterol concentrations, it failed to normalize their elevated triglyceride levels and, in some patients, to prevent the progression of TxCAD.
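As a small illustration of how an exposure figure such as AUC(0-10) is typically obtained, the sketch below applies the linear trapezoidal rule to a concentration-time profile. The sampling times and concentrations are invented for illustration and do not correspond to the study data.

    # Sketch: non-compartmental AUC(0-10 h) by the linear trapezoidal rule.
    import numpy as np

    t = np.array([0, 0.5, 1, 1.5, 2, 3, 4, 6, 8, 10])     # h after the 10-mg dose
    c = np.array([0, 18, 42, 55, 48, 30, 19, 8, 3, 1.5])  # ng/mL (hypothetical)

    auc_0_10 = np.sum((c[1:] + c[:-1]) / 2 * np.diff(t))  # ng*h/mL
    print(f"AUC(0-10) = {auc_0_10:.1f} ng*h/mL")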
Abstract:
A total of 177 patients with primary dislocation of the patella (PDP) were admitted to two trauma centers in Helsinki, Finland during 1991 to 1992. The inclusion criteria were: 1. Acute (≤14 days old) first-time lateral dislocation of the patella. 2. No previous knee operations or major knee injuries. 3. No ligament injuries to be repaired. 4. No osteochondral fractures requiring fixation. Fifty patients were excluded; 30 of these would have met the inclusion criteria but were excluded because 19 received treatment by consultants not involved in the study, 7 refused to participate and 4 had an erroneous randomization. The remaining 127 patients, including 82 females, were then randomized to either a tailor-made operative procedure (group O) or conservative treatment (group C). The aftercare was similar for both groups. The mean age of the patients was 20 (9-47) years. All patients were subjected to analysis of the trauma history (starting position and knee movement during the dislocation), examination under anesthesia (EUA) and arthroscopy. 70 patients (52 females) were randomized by their odd year of birth to operative group O and 57 patients (30 females) by their even year of birth to conservative group C. The diagnosis of PDP was based on locked dislocation in 68 patients, on dislocatability in EUA in 47 patients, and on subluxation in EUA combined with typical intra-articular lesions in 12 patients. In group O, 63 patients had exploration of the injuries on the medial side of the knee and tailor-made reconstruction, supplemented with lateral release in 54 cases. The medial injury was operated on by suturing in 39 patients, by duplication in 18 patients and by additional augmentation of the medial patellofemoral ligament (MPFL) with adductor magnus tenodesis in 6 patients. Seven patients, without locking in the trauma history and with only subluxation in EUA, had only lateral release for realignment. In adductor magnus tenodesis, the proximal end of the distal tendinous part was rerouted to the upper medial border of the patella. In the conservative group C, the treatment was adjusted to the extent of patellar displacement in EUA. Patients with dislocation in EUA had 3 weeks' immobilization with the knee in slight flexion. Mobilization was started with a soft patellar stabilizing orthosis (PSO) used for an additional three weeks. The patients with subluxation in EUA wore an orthosis for six weeks. The aftercare was similar in group O. The outcome was similar in both groups. After an average of 25 (20-45) months' follow-up, the subjective result was better in group C in respect of the mean Hughston VAS knee score (87 for group O and 90 for group C, p=0.04, visual analog scale), but similar in terms of the patient's own overall opinion and the mean Lysholm II knee score. Recurrent instability episodes occurred in 18 patients in group O and in 20 patients in group C. After an average of 7 (6-9) years' follow-up, the groups did not show a statistical difference either in respect of the patient's own overall opinion or the mean Hughston VAS and Kujala knee scores. The proportion of stable patellae was 25/70 (36%) in group O and 17/57 (30%) in group C (p=0.5). In a multivariate risk analysis with a low Kujala score (<90) as the dependent parameter, there was a correlation with female gender (OR: 3.5; 95% CI: 1.4-9.0) and with a loose body on primary radiographs (OR: 4.1; 95% CI: 1.2-15). Recurrent instability correlated with young age at the time of PDP (OR: 0.9; 95% CI: 0.8-1.0 per year).
Girls with an open tibial apophysis had the worst prognosis for instability (88%; 95% CI: 77-98%). The most common mechanisms in the trauma history of the patients were movement into flexion from a straight start (78%) and movement into extension from a well-bent start (8%). Spontaneous relocation of the patella had taken place in 13/39 of girls, in 11/21 of boys, in 26/42 of women and in 17/24 of men with skeletal maturity of the tibia. The dislocation in EUA was non-rotating in 96/126 patients and outward-rotating in 14/126 patients. An operative treatment policy in PDP is not recommended. The locking tendency of the patella in PDP depended on skeletal maturation. The recurrence rate after PDP was higher than expected.
Abstract:
The TOTEM experiment at the LHC will measure the total proton-proton cross-section with a precision better than 1%, elastic proton scattering over a wide range in momentum transfer -t = p^2 theta^2 up to 10 GeV^2, and diffractive dissociation, including single, double and central diffraction topologies. The total cross-section will be measured with the luminosity-independent method, which requires the simultaneous measurement of the total inelastic rate and of elastic proton scattering down to four-momentum transfers of a few 10^-3 GeV^2, corresponding to leading protons scattered at angles of a few microradians from the interaction point. This will be achieved using silicon microstrip detectors, which offer attractive properties such as good spatial resolution (<20 µm), fast response (O(10 ns)) to particles and radiation hardness up to 10^14 n/cm^2. This work reports on the development of an innovative structure at the detector edge that reduces the conventional dead width of 0.5-1 mm to 50-60 µm, compatible with the requirements of the experiment.
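For context, the luminosity-independent method mentioned above combines the optical theorem with the total event rate, so that the total cross-section (and, by inversion, the luminosity) follows from the elastic and inelastic counts alone. A standard formulation is sketched below, with rho the ratio of the real to the imaginary part of the forward elastic amplitude.

    % Luminosity-independent determination of the total cross-section.
    \[
      \sigma_{\mathrm{tot}}
      = \frac{16\pi}{1+\rho^{2}}\,
        \frac{\left.\mathrm{d}N_{\mathrm{el}}/\mathrm{d}t\right|_{t=0}}
             {N_{\mathrm{el}} + N_{\mathrm{inel}}},
      \qquad
      \mathcal{L}
      = \frac{1+\rho^{2}}{16\pi}\,
        \frac{\left(N_{\mathrm{el}} + N_{\mathrm{inel}}\right)^{2}}
             {\left.\mathrm{d}N_{\mathrm{el}}/\mathrm{d}t\right|_{t=0}} .
    \]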
Abstract:
This thesis focuses on the issue of testing sleepiness quantitatively. The issue is relevant to policymakers concerned with traffic and occupational safety, as such testing provides a tool for safety legislation and surveillance. The findings of this thesis provide guidelines for a posturographic sleepiness tester. Sleepiness ensuing from staying awake for merely 17 h impairs performance as much as the legally proscribed blood alcohol concentration of 0.5‰ does. Hence, sleepiness is a major risk factor in transportation and occupational accidents. The lack of convenient, commercial sleepiness tests precludes testing for impending sleepiness, in contrast to simple breath testing for alcohol intoxication. Posturography is a potential sleepiness test, since clinical diurnal balance testing suggests the hypothesis that time awake could be estimated posturographically. Relying on this hypothesis, this thesis examines posturographic sleepiness testing for instrumentation purposes. Empirical results from 63 subjects, whose balance we tested with a force platform during wakefulness lasting a maximum of 36 h, show that sustained wakefulness impairs balance. The results show that time awake is posturographically estimable with 88% accuracy and 97% precision, which validates our hypothesis. The results also show that balance scores tested at 13:30 hours serve as a threshold to detect excessive sleepiness. Analytical results show that the test length has a marked effect on estimation accuracy: 18-s tests suffice to identify sleepiness-related balance changes, but trade off some of the accuracy achieved with 30-s tests. The procedure to estimate time awake relies on equating the subject's test score to a reference table (comprising balance scores tested during sustained wakefulness, regressed against time awake). Empirical results showed that sustained wakefulness explains 60% of the diurnal balance variations, whereas the time of day explains 40%. The latter fact implies that time-awake estimates must also rely on knowing the local times of both the test and the reference scores.
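A minimal sketch of the look-up procedure described above: reference balance scores are regressed against time awake, and a new subject's time awake is estimated by inverting the fitted line at the observed score. The reference values below are synthetic stand-ins, not the thesis data.

    # Sketch: estimating time awake from a balance score via a reference regression.
    import numpy as np

    # reference table: hours awake vs. balance score (e.g. sway area), synthetic
    hours_awake = np.arange(0, 37, 3)
    balance_score = (100 + 4.0 * hours_awake
                     + np.random.default_rng(1).normal(0, 8, hours_awake.size))

    # linear fit: score = a * hours_awake + b
    a, b = np.polyfit(hours_awake, balance_score, 1)

    def estimate_hours_awake(observed_score):
        """Invert the reference regression to estimate time awake."""
        return (observed_score - b) / a

    print(f"score 190 -> estimated {estimate_hours_awake(190):.1f} h awake")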
Abstract:
The analysis uses data from an integrated luminosity of approximately 172 pb^-1 of p-pbar collisions at sqrt(s) = 1.96 TeV, collected with the CDF II detector at the Fermilab Tevatron. The Lambda_b and B0 relative branching fractions are measured to be:
B(Lambda_b to Lambda_c+ mu nu)/B(Lambda_b to Lambda_c+ pi) = 16.6 +- 3.0 (stat) +- 1.0 (syst) +2.6 -3.4 (PDG) +- 0.3 (EBR),
B(B0 to D+ mu nu)/B(B0 to D+ pi) = 9.9 +- 1.0 (stat) +- 0.6 (syst) +- 0.4 (PDG) +- 0.5 (EBR),
B(B0 to D*+ mu nu)/B(B0 to D*+ pi) = 16.5 +- 2.3 (stat) +- 0.6 (syst) +- 0.5 (PDG) +- 0.8 (EBR).
This article also presents measurements of the branching fractions of four new Lambda_b semileptonic decays: Lambda_b to Lambda_c(2595)+ mu nu, Lambda_b to Lambda_c(2625)+ mu nu, Lambda_b to Sigma_c(2455)0 pi mu nu, and Lambda_b to Sigma_c(2455)++ pi mu nu, relative to the branching fraction of the Lambda_b to Lambda_c mu nu decay. Finally, the transverse-momentum distribution of Lambda_b baryons produced in p-pbar collisions is measured and found to be significantly different from that of B0 mesons.
Abstract:
We present a measurement of the top quark mass and of the top-antitop pair production cross section using p-pbar data collected with the CDF II detector at the Tevatron Collider at the Fermi National Accelerator Laboratory, corresponding to an integrated luminosity of 2.9 fb^-1. We select events with six or more jets satisfying a number of kinematical requirements imposed by means of a neural network algorithm. At least one of these jets must originate from a b quark, as identified by the reconstruction of a secondary vertex inside the jet. The mass measurement is based on a likelihood fit incorporating reconstructed mass distributions representative of signal and background, where the absolute jet energy scale (JES) is measured simultaneously with the top quark mass. The measurement yields a value of 174.8 +- 2.4 (stat+JES) +1.2 -1.0 (syst) GeV/c^2, where the uncertainty from the absolute jet energy scale is evaluated together with the statistical uncertainty. The procedure also measures the amount of signal, from which we derive a cross section, sigma_ttbar = 7.2 +- 0.5 (stat) +- 1.0 (syst) +- 0.4 (lum) pb, for the measured values of the top quark mass and JES.