Abstract:

Septic shock is a common killer in intensive care units (ICUs). The most crucial determinant of outcome is the early and aggressive start of treatment aimed at normalization of hemodynamics, together with the start of antibiotics during the very first hours. The optimal targets of hemodynamic treatment, and the impact of hemodynamic treatment on survival after the first resuscitation period, are less well known. The objective of this study was to evaluate different aspects of the hemodynamic pattern in septic shock, with special attention to prediction of outcome. In particular, components of early treatment and monitoring in the ICU were assessed.

A total of 401 patients, 218 with septic shock and 192 with severe sepsis or septic shock, were included in the study. The patients were treated in 24 Finnish ICUs during 1999-2005; 295 of the patients were included in the Finnish national epidemiologic Finnsepsis study.

We found that the hemodynamic variables most important to outcome were the mean arterial pressure (MAP) and lactate during the first six hours in the ICU, and the MAP and a mixed venous oxygen saturation (SvO2) under 70% during the first 48 hours. A MAP under 65 mmHg and an SvO2 below 70% were the best predictive thresholds. A high central venous pressure (CVP) also correlated with adverse outcome.

We assessed the correlation and agreement of SvO2 and central venous oxygen saturation (ScvO2) in septic shock during the first day in the ICU. The mean SvO2 was below ScvO2 during early sepsis. The bias of the difference was 4.2% (95% limits of agreement -8.1% to 16.5%) by Bland-Altman analysis. The difference between the saturation values correlated significantly with cardiac index and oxygen delivery. Thus, ScvO2 cannot be used as a substitute for SvO2 in hemodynamic monitoring in the ICU.

Several biomarkers have been investigated for their ability to help in diagnosis or outcome prediction in sepsis. We assessed the predictive value of N-terminal pro-brain natriuretic peptide (NT-proBNP) for mortality in severe sepsis and septic shock. NT-proBNP levels were significantly higher in hospital nonsurvivors, and the NT-proBNP level 72 hours after inclusion was an independent predictor of hospital mortality. Acute cardiac load contributed to NT-proBNP values at admission, but renal failure was the main confounding factor later. The accuracy of NT-proBNP, however, was not sufficient for clinical decision-making concerning outcome prediction.

Delays in the start of treatment are associated with poorer prognosis in sepsis. We assessed how the early treatment guidelines had been adopted, and what the impact of early treatment was on mortality in septic shock in Finland. We found that early treatment was not optimal in Finnish hospitals, and this was reflected in mortality. Delayed initiation of antimicrobial agents in particular was associated with unfavorable outcome.
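The agreement statistics quoted above (bias with 95% limits of agreement) follow the Bland-Altman method. A minimal sketch of that computation, assuming paired ScvO2/SvO2 measurements are available as arrays; the numbers below are hypothetical, not study data:

```python
import numpy as np

def bland_altman(a: np.ndarray, b: np.ndarray):
    """Bias and 95% limits of agreement between two paired measurements."""
    diff = a - b                # per-pair difference, e.g. ScvO2 - SvO2
    bias = diff.mean()          # mean difference (bias)
    sd = diff.std(ddof=1)       # sample SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired saturations (%) measured at the same time points:
scvo2 = np.array([72.0, 68.5, 75.2, 70.1, 66.8])
svo2  = np.array([66.3, 65.0, 70.4, 64.9, 63.2])
bias, (lo, hi) = bland_altman(scvo2, svo2)
print(f"bias = {bias:.1f}%, 95% limits of agreement {lo:.1f}% to {hi:.1f}%")
```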

Abstract:

The outcome of the successfully resuscitated patient is mainly determined by the extent of hypoxic-ischemic cerebral injury, and hypothermia has multiple mechanisms of action in mitigating such injury. The present study was undertaken from 1997 to 2001 in Helsinki as part of the European multicenter study Hypothermia After Cardiac Arrest (HACA) to test the neuroprotective effect of therapeutic hypothermia in patients resuscitated from out-of-hospital ventricular fibrillation (VF) cardiac arrest (CA). The aim of this substudy was to examine the neurological and cardiological outcome of these patients, and especially to study and develop methods for predicting outcome in hypothermia-treated patients.

A total of 275 patients were randomized to the HACA trial in Europe. In Helsinki, 70 patients were enrolled in the study according to the inclusion criteria. Those randomized to hypothermia were actively cooled externally to a core temperature of 33 ± 1°C for 24 hours with a cooling device. Serum markers of ischemic neuronal injury, NSE and S-100B, were sampled at 24, 36, and 48 hours after CA. Somatosensory and brain stem auditory evoked potentials (SEPs and BAEPs) were recorded 24 to 28 hours after CA; 24-hour ambulatory electrocardiography recordings were performed three times during the first two weeks, and arrhythmias and heart rate variability (HRV) were analyzed from the tapes. The clinical outcome was assessed 3 and 6 months after CA. Neuropsychological examinations were performed on the conscious survivors 3 months after the CA. Quantitative electroencephalography (Q-EEG) and auditory P300 event-related potentials were studied at the same time point.

Therapeutic hypothermia of 33°C for 24 hours led to an increased chance of good neurological outcome and survival after out-of-hospital VF CA. In the HACA study, 55% of hypothermia-treated patients and 39% of normothermia-treated patients reached a good neurological outcome (p=0.009) at 6 months after CA. Use of therapeutic hypothermia was not associated with any increase in clinically significant arrhythmias. The levels of serum NSE, but not of S-100B, were lower in hypothermia- than in normothermia-treated patients. A decrease in NSE values between 24 and 48 hours was associated with good outcome at 6 months after CA. Decreasing levels of serum NSE, but not of S-100B, over time may indicate selective attenuation of delayed neuronal death by therapeutic hypothermia, and the time course of serum NSE between 24 and 48 hours after CA may help in clinical decision-making. In SEP recordings, bilaterally absent N20 responses predicted permanent coma with a specificity of 100% in both treatment arms. Recording of BAEPs provided no additional benefit in outcome prediction. Preserved 24- to 48-hour HRV may be a predictor of favorable outcome in CA patients treated with hypothermia. At 3 months after CA, no differences appeared in any cognitive functions between the two groups: 67% of patients in the hypothermia group and 44% of patients in the normothermia group were cognitively intact or had only very mild impairment. No significant differences emerged in any of the Q-EEG parameters between the two groups. The amplitude of the P300 potential was significantly higher in the hypothermia-treated group. These results give further support to the use of therapeutic hypothermia in patients with sudden out-of-hospital CA.
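HRV was analyzed from the 24-hour ambulatory ECG tapes; a common time-domain HRV measure is SDNN, the standard deviation of normal-to-normal RR intervals. The abstract does not state which HRV indices were used, so this is only an illustrative sketch with hypothetical RR data:

```python
import numpy as np

def sdnn_ms(rr_ms: np.ndarray) -> float:
    """SDNN: standard deviation of normal-to-normal RR intervals (ms)."""
    return float(np.std(rr_ms, ddof=1))

# Hypothetical RR intervals (ms) from one segment of a 24-hour recording:
rr = np.array([812.0, 798.0, 805.0, 840.0, 795.0, 788.0, 820.0, 815.0])
print(f"SDNN = {sdnn_ms(rr):.1f} ms")
```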

Abstract:

Osteoporosis is a skeletal disorder characterized by compromised bone strength that predisposes to increased fracture risk. Childhood and adolescence are critical periods for bone mass gain: peak bone mass is mostly acquired by the age of 18 years and is an important determinant of adult bone health and lifetime fracture risk. Medications, especially glucocorticoids (GCs), chronic inflammation, decreased physical activity, hormonal deficiencies, delayed puberty, and poor nutrition may predispose children and adolescents with a chronic disease to impaired bone health.

In this work, we studied overall bone health and the incidence and prevalence of fractures in children and adolescents who were treated for juvenile idiopathic arthritis (JIA) or had undergone solid organ transplantation. The first study cohort included 62 patients diagnosed with JIA and treated with GCs. The epidemiology of fractures after transplantation was investigated in 196 patients, and a more detailed analysis of bone health determinants was performed on 40 liver (LTx) and 106 renal (RTx) transplantation patients. Bone mineral density (BMD) and vertebral morphology were assessed by dual-energy x-ray absorptiometry. Standard radiographs were obtained to detect vertebral fractures and to determine bone age; BMD values were adjusted for skeletal maturity.

Our study showed that median BMD values were subnormal in all patient cohorts. The values were highest in patients with JIA and lowest in patients with LTx. Age at transplantation influenced BMD values in LTx but not RTx patients: BMD values were higher in patients who had undergone LTx before the age of two years. BMD was lowest during the immediate posttransplantation years and increased subnormally during puberty. Delayed skeletal maturation was common in all patient groups. The prevalence of vertebral fractures ranged from 10% to 19% in the cohorts. Most of the fractures were asymptomatic and diagnosed only at screening. Vertebral fractures were most common in LTx patients. Vitamin D deficiency was common in all patient groups; only 3% of the patients with JIA and 25% of the transplantation patients were considered to have adequate serum vitamin D levels. The total cumulative weight-adjusted dose of GC was not associated with BMD values in JIA or LTx patients. The combination of female gender and age over 15 years, a parathyroid hormone concentration over 100 ng/L, and a cumulative weight-adjusted methylprednisolone dose over 150 mg/kg during the three preceding years were important predictors of low lumbar spine BMD in RTx patients.

Based on the high prevalence of osteoporosis in the study cohorts, more effort should be put into the prevention and early diagnosis of osteoporosis in these pediatric patients.
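BMAD corrects areal BMD for bone size; for the lumbar spine it is commonly estimated as bone mineral content divided by the projected area raised to the power 1.5, and the result is expressed as a Z-score against normative data. A sketch under those assumptions; the DXA and reference numbers are hypothetical:

```python
def bmad_lumbar(bmc_g: float, area_cm2: float) -> float:
    """Bone mineral apparent density (g/cm^3), estimated for the lumbar
    spine as BMC / Area**1.5 (a common size correction)."""
    return bmc_g / area_cm2 ** 1.5

def z_score(value: float, ref_mean: float, ref_sd: float) -> float:
    """Z-score relative to an age-matched normative mean and SD."""
    return (value - ref_mean) / ref_sd

# Hypothetical DXA output and normative reference values:
bmad = bmad_lumbar(bmc_g=24.0, area_cm2=38.0)
print(f"BMAD = {bmad:.3f} g/cm^3, Z = {z_score(bmad, 0.110, 0.012):+.1f}")
```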

Abstract:

Some perioperative clinical factors related to the primary cemented arthroplasty operation for osteoarthritis of the hip or knee joint are studied and discussed in this thesis.

In a randomized, double-blind study, 39 patients were divided into two groups: one receiving tranexamic acid and the other not receiving it. Tranexamic acid was given in a dose of 10 mg/kg before the operation and twice thereafter, at 8-hour intervals. Total blood loss was smaller in the tranexamic acid group than in the control group. No thromboembolic complications were noticed.

In a prospective, randomized study, 58 patients with hip arthroplasty and 39 patients with knee arthroplasty were divided into groups with and without postoperative closed-suction drainage. There was no difference in healing of the wounds, postoperative blood transfusions, complications, or range of motion. As a result of this study, the use of drains is no longer recommended.

In a randomized study, the effectiveness of a femoral nerve block (25 patients) was compared with other methods of pain control (24 patients) on the first postoperative day after total knee arthroplasty. The femoral block consisted of a single injection administered at the patient's bedside during the surgeon's hospital rounds. Femoral block patients reported less pain and required half the amount of oxycodone. An additional femoral block or continued epidural analgesia was required more frequently by the control group patients. Pain management with femoral blocks resulted in less work for the nursing staff.

In a retrospective study of 422 total hip and knee arthroplasty cases, the C-reactive protein levels and clinical course were examined. After hip and knee arthroplasty, the maximal C-reactive protein values are seen on the second and third postoperative days, after which the level decreases rapidly. There is no difference between patients with cemented and uncemented prostheses. Major postoperative complications may cause a further increase in C-reactive protein levels at one and two weeks.

In-hospital and outpatient postoperative control radiographs of 200 hip and knee arthroplasties were reviewed retrospectively. If the postoperative radiographs are of good quality, there seems to be no need for early repetitive radiographs. The quality and safety of follow-up are not compromised by limiting follow-up radiographs to those with clinical indications, and exposure of the patients and the staff to radiation is reduced. Review of the radiographs by the treating orthopaedic surgeon alone is sufficient.

These factors may seem separate from each other, but linking them together may help the treating orthopaedic surgeon formulate an adequate patient care strategy. Notable savings can be achieved.

Abstract:

The adequacy of anesthesia has been studied since the introduction of balanced general anesthesia. Commercial monitors based on electroencephalographic (EEG) signal analysis have been available for monitoring the hypnotic component of anesthesia since the beginning of the 1990s. Monitors measuring the depth of anesthesia assess the cortical function of the brain and have gained acceptance during surgical anesthesia with most of the anesthetic agents used. However, due to frequent artifacts, they are considered unsuitable for monitoring consciousness in intensive care patients. The assessment of analgesia is one of the cornerstones of general anesthesia. Prolonged surgical stress may lead to increased morbidity and delayed postoperative recovery. However, no validated monitoring method is currently available for evaluating analgesia during general anesthesia. Awareness during anesthesia is caused by an inadequate level of hypnosis. This rare but severe complication of general anesthesia may lead to marked emotional stress and possibly posttraumatic stress disorder.

In the present series of studies, the incidence of awareness and recall during outpatient anesthesia was evaluated and compared with that in inpatient anesthesia. A total of 1500 outpatients and 2343 inpatients underwent a structured interview. Clear intraoperative recollections were rare, with an incidence of 0.07% in outpatients and 0.13% in inpatients. No significant differences emerged between outpatients and inpatients. However, significantly smaller doses of sevoflurane were administered to outpatients with awareness than to those without recollections (p<0.05).

EEG artifacts in 16 brain-dead organ donors were evaluated during organ harvest surgery in a prospective, open, nonselective study. The source of the frontotemporal biosignals in brain-dead subjects was studied, and the resistance of the bispectral index (BIS) and Entropy to signal artifacts was compared. The hypothesis was that in brain-dead subjects, most of the biosignals recorded from the forehead would consist of artifacts. The original EEG was recorded, and State Entropy (SE), Response Entropy (RE), and BIS were calculated and monitored during solid organ harvest. SE differed from zero (inactive EEG) in 28%, RE in 29%, and BIS in 68% of the total recording time (p<0.0001 for all). The median values during the operation were SE 0.0, RE 0.0, and BIS 3.0. In four of the 16 organ donors, the EEG was not inactive, and unphysiologically distributed, nonreactive rhythmic theta activity was present in the original EEG signal. After the results from subjects with persistent residual EEG activity were excluded, SE, RE, and BIS differed from zero in 17%, 18%, and 62% of the recorded time, respectively (p<0.0001 for all). Due to various artifacts, the highest readings in all indices were recorded without neuromuscular blockade. The main sources of artifacts were electrocauterization, electromyography (EMG), 50-Hz artifact, handling of the donor, ballistocardiography, and electrocardiography.

In a prospective, randomized study of 26 patients, the ability of the Surgical Stress Index (SSI) to differentiate between two clinically different analgesic levels during shoulder surgery was evaluated. SSI values were lower in patients with an interscalene brachial plexus block than in patients without an additional plexus block. In all patients, anesthesia was maintained with desflurane, the concentration of which was targeted to maintain SE at 50. Increased blood pressure or heart rate (HR), movement, and coughing were considered signs of intraoperative nociception and treated with alfentanil. Photoplethysmographic waveforms were collected from the arm contralateral to the operated side, and SSI was calculated offline. Two minutes after skin incision, SSI was not increased in the brachial plexus block group and was lower (38 ± 13) than in the control group (58 ± 13, p<0.005). Among the controls, one minute prior to alfentanil administration, the SSI value was higher than during periods of adequate antinociception, 59 ± 11 vs. 39 ± 12 (p<0.01). The total cumulative need for alfentanil was higher in the controls (2.7 ± 1.2 mg) than in the brachial plexus block group (1.6 ± 0.5 mg, p=0.008). Tetanic stimulation to the ulnar region of the hand increased SSI significantly only among patients with a brachial plexus block not covering the site of stimulation.

The prognostic value of EEG-derived indices was evaluated and compared with transcranial Doppler ultrasonography (TCD), serum neuron-specific enolase (NSE), and S-100B after cardiac arrest. Thirty patients resuscitated from out-of-hospital arrest and treated with induced mild hypothermia for 24 h were included. The original EEG signal was recorded, and the burst suppression ratio (BSR), RE, SE, and wavelet subband entropy (WSE) were calculated. Neurological outcome during the six-month period after arrest was assessed with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). Twenty patients had a CPC of 1-2, one patient had a CPC of 3, and nine patients died (CPC 5). BSR, RE, and SE differed between the good (CPC 1-2) and poor (CPC 3-5) outcome groups (p=0.011, p=0.011, and p=0.008, respectively) during the first 24 h after arrest. WSE was borderline higher in the good outcome group between 24 and 48 h after arrest (p=0.050). All patients with status epilepticus died, and their WSE values were lower (p=0.022). S-100B was lower in the good outcome group upon arrival at the intensive care unit (p=0.010). After hypothermia treatment, NSE and S-100B values were lower (p=0.002 for both) in the good outcome group. The pulsatile index was also lower in the good outcome group (p=0.004).

In conclusion, the incidence of awareness in outpatient anesthesia did not differ from that in inpatient anesthesia; outpatients are not at increased risk for intraoperative awareness relative to inpatients undergoing general anesthesia. SE, RE, and BIS showed non-zero values that normally indicate cortical neuronal function, but in these subjects they were mostly due to artifacts after the clinical diagnosis of brain death; Entropy was more resistant to artifacts than BIS. During general anesthesia and surgery, SSI values were lower in patients with an interscalene brachial plexus block covering the sites of nociceptive stimuli, and in detecting nociceptive stimuli SSI performed better than HR, blood pressure, or RE. BSR, RE, and SE differed between the good and poor neurological outcome groups during the first 24 h after cardiac arrest, and they may be an aid in differentiating patients with good neurological outcomes from those with poor outcomes after out-of-hospital cardiac arrest.
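The burst suppression ratio used in the cardiac arrest substudy is conventionally the fraction of an EEG epoch spent in suppression, i.e., in runs where the amplitude stays below a threshold for a minimum duration. A minimal sketch; the threshold, minimum run length, and test signal are assumptions, not the monitors' proprietary definitions:

```python
import numpy as np

def burst_suppression_ratio(eeg_uv: np.ndarray, fs: float,
                            thr_uv: float = 5.0, min_s: float = 0.5) -> float:
    """Fraction of the epoch classified as suppression: runs where |EEG|
    stays below thr_uv microvolts for at least min_s seconds."""
    below = np.abs(eeg_uv) < thr_uv
    min_len = int(min_s * fs)
    suppressed = np.zeros(below.shape, dtype=bool)
    run_start = None
    # Append a sentinel False so a run reaching the end is also closed.
    for i, b in enumerate(np.append(below, False)):
        if b and run_start is None:
            run_start = i
        elif not b and run_start is not None:
            if i - run_start >= min_len:
                suppressed[run_start:i] = True
            run_start = None
    return float(suppressed.mean())

# Hypothetical two-second epoch at 100 Hz; the second half is suppressed:
rng = np.random.default_rng(0)
sig = np.concatenate([20 * rng.standard_normal(100), rng.standard_normal(100)])
print(f"BSR = {burst_suppression_ratio(sig, fs=100.0):.2f}")  # about 0.50
```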

Abstract:

This clinical study focused on the effects of childhood specific language impairment (SLI) on daily functioning in later life. SLI is a neurobiological disorder with a genetic predisposition that manifests as poor language production or comprehension, or both, in a child with age-level non-verbal intelligence and no other known cause for deficient language development. The prevalence rate of around 7% puts it among the most prevalent developmental disorders in childhood. Negative long-term effects, such as problems in learning and behavior, are frequent. In follow-up studies the focus has seldom been on self-perception of daily functioning and participation, which are considered important in the International Classification of Functioning, Disability, and Health (ICF).

To investigate the self-perceived aspects of everyday functioning in individuals with childhood receptive SLI compared with age- and gender-matched control populations, the 15D, 16D, and 17D health-related quality of life (HRQoL) questionnaires were applied. These generic questionnaires include 15, 16, and 17 dimensions, respectively, and give both a single index score and a profile with values on each dimension. Information on different life domains (rehabilitation, education, employment, etc.) in each age group was collected with separate questionnaires. The study groups comprised adults, adolescents (12-16 years), and pre-adolescents (8-11 years) who had received a diagnosis of receptive SLI and had been examined, usually before school age, at the Department of Phoniatrics of Helsinki University Central Hospital, where children with language deficits of various etiologies are examined and treated by a multidisciplinary team.

The adult respondents included 33 subjects with a mean age of 34 years. Measured with the 15D, the subjects perceived their HRQoL to be nearly as good as that of their controls, but on the dimensions of speech, usual activities, mental functioning, and distress they were significantly worse off. They significantly more often lived with their parents (19%) or were pensioned (26%) than the adult Finnish population on average. Adults with self-perceived problems in finding words and in remembering instructions, manifestations of persistent language impairment, showed inferior everyday functioning compared with the rest of the study group.

Of the adolescents and pre-adolescents, 48 and 51, respectively, responded. The majority in both groups had received special education or extra educational support at school. All had attended speech therapy at some point; at the time of the study only one adolescent, but every third pre-adolescent, still received speech therapy. The 16D scores of the adolescents and the 17D scores of the pre-adolescents did not differ from those of their controls. The 16D profiles differed on some dimensions: subjects were significantly worse off on the dimension of mental functioning, but better off on the dimension of vitality, than controls. Of the 17D dimensions, the study group was significantly worse off on speech, whereas the control group reported significantly more problems with sleeping. Of the childhood performance measures investigated, a low verbal intelligence quotient (VIQ), which is often considered to reflect receptive language impairment, was in adult subjects significantly associated with some of the self-perceived problems, such as problems in usual activities and mental functioning.

The 15D, 16D, and 17D questionnaires served well in measuring self-perceived HRQoL. Such standardized measures with population values are especially important for conforming to the ICF framework. In the future these questionnaires could perhaps be used on a more individual level in the follow-up of children in clinics, and even in special schools and classes, to detect those children at greatest risk of negative long-term effects and perhaps diminished well-being regarding daily functioning and participation.
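The 15D-family questionnaires combine the dimension-level values into a single index score using population-based valuation weights. The instruments' actual valuation algorithms are more elaborate; the sketch below only illustrates the weighted-combination idea, with entirely hypothetical weights and three of the dimensions:

```python
# Minimal sketch of a 15D-type index: each dimension carries a population-based
# importance weight, and the respondent's chosen level maps to a value in [0, 1].
# Weights and level values below are hypothetical (3 of 15 dimensions shown).
importance = {"mobility": 1.0, "speech": 0.9, "usual_activities": 1.1}
level_value = {"mobility": 1.00, "speech": 0.85, "usual_activities": 0.90}

# Weighted mean of the level values -> a single index score (1.0 = full health).
index = sum(importance[d] * level_value[d] for d in importance) / sum(importance.values())
print(f"index = {index:.3f}")  # 0.918 with these hypothetical numbers
```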

Abstract:

A lack of suitable venous graft material or poor outflow is an increasingly encountered situation in peripheral vascular surgery. Prosthetic grafts have clearly worse patency than vein grafts in femorodistal bypass surgery. The use of an adjuvant arteriovenous fistula (av-fistula) at the distal anastomosis has been postulated to improve the flow and thus increase prosthetic graft patency. In theory, the adjuvant fistula might have the same effect in a venous bypass with compromised outflow. A free flap transfer also augments graft flow and may have a positive effect on an ischaemic limb. The aim of this study was to evaluate the possible benefit of an adjuvant av-fistula, and of an internal av-fistula within a free flap transfer, on the patency and outcome of an infrapopliteal bypass. The effect of the av-fistula on bypass haemodynamics was also assessed, along with possible adverse effects.

Patients and methods:
1. A prospective randomised multicentre trial comprised 59 patients with critical leg ischaemia and no suitable veins for grafting. Femorocrural polytetrafluoroethylene (PTFE) bypasses with a distal vein cuff, with or without an adjuvant av-fistula, were performed. The outcome was assessed according to graft patency and leg salvage.
2. Haemodynamic measurements were performed on a total of 50 patients from Study I with a prolonged follow-up.
3. Nine critically ischaemic limbs were treated with a modified radial forearm flap transfer in combination with a femorodistal bypass operation. An internal av-fistula was created within the free flap transfer to increase flap artery and bypass graft flow.
4. The effect of a previous free flap transfer on bypass haemodynamics was studied in a case report.
5. In a retrospective multicentre case-control study, 77 infrapopliteal vein bypasses with an adjuvant av-fistula were compared with matched controls without a fistula. The outcome and haemodynamics of the bypasses were recorded.

Main results:
1. The groups with and without the av-fistula did not differ as regards prosthetic graft patency or leg salvage.
2. The intra- and postoperative prosthetic graft flow was significantly increased in the patients with the av-fistula. However, this increase did not improve patency. There was no difference in patency between the groups, even in the extended follow-up.
3. The vein graft flow increased significantly after the anastomosis of the radial forearm flap with an internal av-fistula.
4. A previously performed free flap transfer significantly augmented the flow of a poor-outflow femoropedal bypass graft.
5. The adjuvant av-fistula increased the venous infrapopliteal bypass flow significantly. The increased flow did not, however, lead to improved graft patency or leg salvage.

Conclusion: An adjuvant av-fistula does not improve the patency of a femorocrural PTFE bypass with a distal vein cuff, despite the fact that the flow values increased both in the intraoperative measurements and during the immediate postoperative surveillance. The adjuvant av-fistula also increased graft flow significantly in a poor-outflow venous bypass, but regardless of this the outcome was not improved. The adjuvant av-fistula rarely caused adverse effects. In a group of diabetic patients, the flow in a vascular bypass graft was augmented by an internal av-fistula within a radial forearm flap, and similarly, in a patient with a previous free flap transfer, a high intraoperative graft flow was achieved due to the free flap shunt effect.

Abstract:

Most of the diseases affecting public health, like hypertension, are multifactorial in etiology. Hypertension is influenced by genetic, lifestyle, and environmental factors. Estimates of the genetic contribution to the risk of essential hypertension vary from 30 to 50%. It is plausible that in most cases susceptibility to hypertension is determined by the action of more than one gene. Although the exact molecular mechanism underlying essential hypertension remains obscure, several monogenic forms of hypertension have been identified. Since common genetic variations may predict not only susceptibility to hypertension but also the response to antihypertensive drug therapy, pharmacogenetic approaches may provide useful markers for finding relations between candidate genes and phenotypes of hypertension.

The aim of this study was to identify genetic mutations and polymorphisms contributing to human hypertension, and to examine their relationships to intermediate phenotypes of hypertension, such as blood pressure (BP) responses to antihypertensive drugs or biochemical laboratory values. Two groups of patients were investigated. The first group was collected from the database of patients investigated in the Hypertension Outpatient Ward, Helsinki University Central Hospital, and consisted of 399 subjects considered to have essential hypertension. Frequencies of the mutant or variant alleles were compared with those in two reference groups, healthy blood donors (n = 301) and normotensive males (n = 175). The second group of subjects with hypertension was collected prospectively. These study subjects (n = 313) underwent a protocol lasting eight months, including four one-month treatment periods with antihypertensive medications (a thiazide diuretic, a β-blocker, a calcium channel antagonist, and an angiotensin II receptor antagonist). BP responses and laboratory values were related to polymorphisms of several candidate genes of the renin-angiotensin system (RAS). In addition, two patients with typical features of Liddle's syndrome were screened for mutations in the kidney epithelial sodium channel (ENaC) subunits.

Two novel mutations causing Liddle's syndrome were identified. The first was located in the beta-subunit of ENaC and the second in the gamma-subunit, constituting the first identified Liddle mutation located in the extracellular domain. The latter mutation showed a 2-fold increase in channel activity in vitro. Three gene variants, of which two are novel, were identified in ENaC subunits. The prevalence of the variants was three times higher in hypertensive patients (9%) than in the reference groups (3%). The variant carriers had an increased daily urinary potassium excretion rate in relation to their renin levels compared with controls, suggesting increased ENaC activity, although in vitro the variants did not show increased channel activity. Of the common polymorphisms of the RAS studied, the angiotensin II receptor type 1 (AGTR1) 1166 A/C polymorphism was associated with modest changes in RAS activity: patients homozygous for the C allele tended to have increased aldosterone and decreased renin levels. In vitro functional studies using transfected HEK293 cells provided additional evidence that the AGTR1 1166 C allele may be associated with increased expression of AGTR1. Common polymorphisms of the alpha-adducin and RAS genes did not significantly predict BP responses to one-month monotherapies with hydrochlorothiazide, bisoprolol, amlodipine, or losartan.

In conclusion, two novel mutations of ENaC subunits causing Liddle's syndrome were identified. In addition, three common ENaC polymorphisms were shown to be associated with the occurrence of essential hypertension, but their exact functional and clinical consequences remain to be explored. The AGTR1 1166 C allele may modify the endocrine phenotype of hypertensive patients when present in homozygous form. Certain widely studied polymorphisms of the ACE, angiotensinogen, AGTR1, and alpha-adducin genes did not significantly affect responses to a thiazide, β-blocker, calcium channel antagonist, or angiotensin II receptor antagonist.
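The three-fold difference in variant carrier prevalence (9% vs. 3%) is the kind of 2x2 comparison usually tested with Fisher's exact test. A sketch with carrier counts back-calculated approximately from the stated percentages and group sizes; this is illustrative only, not the study's actual analysis:

```python
from scipy.stats import fisher_exact

# Approximate carrier counts reconstructed from the stated percentages:
# 399 hypertensive patients with ~9% carriers, 476 referents with ~3% carriers.
table = [[36, 399 - 36],
         [14, 476 - 14]]
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")
```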

Abstract:

Head and neck squamous cell cancer (HNSCC) is the sixth most common cancer worldwide. Despite advances in combined-modality therapy (surgery, radiotherapy, chemotherapy), the 5-year survival rate in stage III and IV disease remains at 40-60%. Short-range Auger-electron emitters, such as In-111 and In-114m, tagged to a drug, molecule, peptide, protein, or nanoparticle brought into close proximity to nuclear DNA represent a fascinating alternative for treating cancer. In this thesis, we studied the usefulness of the Indium-111-bleomycin complex (In-111-BLMC) in the diagnostics and potential therapy of HNSCC using in vitro HNSCC cell lines, in vivo nude mice, and in vivo HNSCC patients.

In the in vitro experiments with HNSCC cell lines, the sensitivity to external beam radiation, BLM, In-111-BLMC, and In-111-Cl3 was studied using the 96-well plate clonogenic assay. The influence of BLM and In-111-BLMC on the cell cycle was measured with flow cytometry. In the in vivo nude mice xenograft studies, the activity ratios of In-111-BLMC were obtained from gamma camera images, and the effect of In-111-BLMC on HNSCC xenografts was studied. In the in vivo patient studies, we determined the tumor uptake of In-111-BLMC with a gamma camera and the radioactivity from tumor samples, using In-111-BLMC with a specific activity of 75, 175, or 375 MBq/mg BLM. The S values, i.e., the absorbed dose in a target organ per cumulated activity in a source organ, were simulated for In-111 and In-114m.

The in vitro studies showed variation in sensitivity to external beam radiation, BLM, and In-111-BLMC between HNSCC cell lines. IC50 values for BLM were 1.6-, 1.8-, and 2.1-fold higher than for In-111-BLMC (40 MBq/mg BLM) in three HNSCC cell lines. A specific In-111 activity of 40 MBq/mg BLM was more effective in killing cells than a specific In-111 activity of 195 MBq/mg BLM (p=0.0023). In-111-Cl3 alone had no killing effect. The percentage of cells in the G2/M phase increased after exposure to BLM, and especially to In-111-BLMC, in the three cell lines studied, indicating a G2/M block. Tumor-seeking behavior was shown in the in vivo imaging study of xenografted mice. BLM and In-111-BLMC were more effective than NaCl in reducing xenografted tumor size in HNSCC. The uptake ratios obtained from gamma camera images in the in vivo patient study varied from 1.2 to 2.8 in malignant tumors. However, the uptake of In-111-BLMC was unaffected by increasing the injected activity. A positive correlation existed between In-111-BLMC uptake, Ki-67/MIB activity, and the number of mitoses. Regarding the S values, In-114m delivered a 4-fold absorbed radiation dose to the tumor compared with In-111, and thus In-114m-BLMC might be more effective than In-111-BLMC at the DNA level.

Auger-electron emitters, such as In-111 and In-114m, might have potential in the treatment of HNSCC. Further studies are needed to develop a radiopharmaceutical agent with appropriate physical properties of the radionuclide and a suitable carrier to bring it to the targeted tissue.
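The S values described here belong to the MIRD dosimetry formalism, in which the absorbed dose to a target region is the cumulated (time-integrated) activity in each source region multiplied by the corresponding S value; schematically:

```latex
% MIRD schema: dose D to target r_T from sources r_S equals the
% cumulated activity in each source times the S value.
D(r_T) = \sum_{r_S} \tilde{A}(r_S)\, S(r_T \leftarrow r_S),
\qquad
\tilde{A}(r_S) = \int_0^{\infty} A(r_S, t)\, \mathrm{d}t
```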

Abstract:

Drugs and surgical techniques may have harmful renal effects during the perioperative period. Traditional biomarkers are often insensitive to minor renal changes, but novel biomarkers may more accurately detect disturbances in glomerular and tubular function and integrity. The purpose of this study was, first, to evaluate the renal effects of ketorolac and clonidine during inhalation anesthesia with sevoflurane and isoflurane, and secondly, to evaluate the effect of tobacco smoking on the production of inorganic fluoride (F-) following enflurane and sevoflurane anesthesia, as well as to determine the effect of F- on renal function and cellular integrity in surgical patients.

A total of 143 patients undergoing either conventional (n = 75) or endoscopic (n = 68) inpatient surgery were enrolled in four studies. The ketorolac and clonidine studies were prospective, randomized, placebo-controlled, and double-blinded, while the cigarette smoking studies were prospective cohort studies with two parallel groups.

As a sign of proximal tubular deterioration, a similar transient increase in the urine N-acetyl-beta-D-glucosaminidase/creatinine ratio (U-NAG/crea) was noted in both the ketorolac group and the controls (baseline vs. at two hours of anesthesia, p = 0.015) during sevoflurane anesthesia of 3.3 MAC-hours (minimum alveolar concentration hours). Uncorrected U-NAG increased above the maximum concentration measured in healthy volunteers (6.1 units/l) in 5/15 patients with ketorolac and in none of the controls (p = 0.042). As a further sign of proximal tubular deterioration, the urine glutathione transferase-alpha/creatinine ratio (U-GST-alpha/crea) increased in both groups at two hours after anesthesia, but a more significant increase was noted in the patients with ketorolac. U-GST-alpha/crea increased above the maximum ratio measured in healthy volunteers in 7/15 patients with ketorolac and in 3/15 controls.

Clonidine diminished the activation of the renin-angiotensin-aldosterone system during pneumoperitoneum, and urine output was better preserved in the patients treated with clonidine (1/15 patients developed oliguria) than in the controls (8/15 developed oliguria, p = 0.005). Most patients with pneumoperitoneum and isoflurane anesthesia developed a transient proximal tubular deterioration, as U-NAG increased above 6.1 units/L in 11/15 patients with clonidine and in 7/15 controls. In the patients receiving clonidine, the median U-NAG/crea was higher than in the controls at 60 minutes of pneumoperitoneum (p = 0.01), suggesting that clonidine may worsen proximal tubular deterioration.

Smoking induced the metabolism of enflurane, but renal function remained intact in both the smokers and the non-smokers with enflurane anesthesia. In contrast, smoking did not induce sevoflurane metabolism, but glomerular function decreased in 4/25 non-smokers and in 7/25 smokers with sevoflurane anesthesia. All five patients with S-F- ≥ 40 micromol/L, but only 6/45 with S-F- below 40 micromol/L (p = 0.001), developed a serum tumor-associated trypsin inhibitor concentration above 3 nmol/L as a sign of glomerular dysfunction. As a sign of proximal tubular deterioration, U-beta-2-microglobulin increased in 2/5 patients with S-F- over 40 micromol/L compared with 2/45 patients whose highest S-F- remained below 40 micromol/L (p = 0.005).

To conclude, sevoflurane anesthesia may cause a transient proximal tubular deterioration, which may be worsened by co-administration of ketorolac. Clonidine premedication prevents the activation of the renin-angiotensin-aldosterone system and preserves normal urine output, but may be harmful to the proximal tubules during pneumoperitoneum. Smoking induces the metabolism of enflurane but not that of sevoflurane. A serum F- of 40 micromol/L or higher may induce glomerular dysfunction and proximal tubular deterioration in patients with sevoflurane anesthesia. The novel renal biomarkers warrant further studies in order to establish reference values for surgical patients undergoing inhalation anesthesia.
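Anesthetic exposure above is expressed in MAC-hours: the end-tidal agent concentration divided by its MAC, accumulated over time. A worked sketch; the MAC value and the concentration segments are assumed purely for illustration:

```python
# MAC-hours: (end-tidal concentration / MAC) integrated over time,
# here as a sum over constant-concentration segments.
MAC_SEVO = 2.0                       # vol%, assumed for illustration
segments = [(2.5, 1.0), (1.8, 0.8)]  # (end-tidal vol%, duration in hours)

mac_hours = sum(conc / MAC_SEVO * hours for conc, hours in segments)
print(f"{mac_hours:.2f} MAC-hours")  # 2.5/2.0*1.0 + 1.8/2.0*0.8 = 1.97
```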

Abstract:

Background. Hyperlipidemia is a common concern in patients with heterozygous familial hypercholesterolemia (HeFH) and in cardiac transplant recipients. In both groups, an elevated serum LDL cholesterol level accelerates the development of atherosclerotic vascular disease and increases the rates of cardiovascular morbidity and mortality. The purpose of this study was to assess the pharmacokinetics, efficacy, and safety of cholesterol-lowering pravastatin in children with HeFH and in pediatric cardiac transplant recipients receiving immunosuppressive medication.

Patients and Methods. The pharmacokinetics of pravastatin was studied in 20 HeFH children and in 19 pediatric cardiac transplant recipients receiving triple immunosuppression. The patients ingested a single 10-mg dose of pravastatin, and plasma pravastatin concentrations were measured for up to 10 or 24 hours. The efficacy and safety of pravastatin (maximum dose 10 to 60 mg/day and 10 mg/day, respectively) over one to two years were studied in 30 patients with HeFH and in 19 cardiac transplant recipients. In a subgroup of 16 HeFH children, serum non-cholesterol sterol ratios (10² × mmol/mol of cholesterol), surrogate estimates of cholesterol absorption (cholestanol, campesterol, sitosterol) and synthesis (desmosterol and lathosterol), were studied at study baseline (on plant stanol esters) and during combination of pravastatin with plant stanol esters. In the transplant recipients, the lipoprotein levels and their mass compositions were analyzed before and after one year of pravastatin use, and then compared with values measured in 21 healthy pediatric controls. The transplant recipients were grouped into patients with transplant coronary artery disease (TxCAD) and patients without TxCAD, based on annual angiography evaluations before pravastatin.

Results. In the cardiac transplant recipients, the mean area under the plasma concentration-time curve of pravastatin [AUC(0-10)], 264.1 ± 192.4 ng·h/mL, was nearly ten-fold higher than in the HeFH children (26.6 ± 17.0 ng·h/mL). By 2, 4, 6, 12, and 24 months of treatment, the LDL cholesterol levels in the HeFH children had decreased by 25%, 26%, 29%, 33%, and 32%, respectively. In the HeFH group, pravastatin treatment increased the markers of cholesterol absorption and decreased those of synthesis. High ratios of cholestanol to cholesterol were associated with poor cholesterol-lowering efficacy of pravastatin. In the cardiac transplant recipients, pravastatin 10 mg/day lowered the LDL cholesterol by approximately 19%. Compared with the patients without TxCAD, patients with TxCAD had significantly lower HDL cholesterol concentrations and higher apoB-100/apoA-I ratios at baseline (1.0 ± 0.3 mmol/L vs. 1.4 ± 0.3 mmol/L, P = 0.031; and 0.7 ± 0.2 vs. 0.5 ± 0.1, P = 0.034) and after one year of pravastatin use (1.0 ± 0.3 mmol/L vs. 1.4 ± 0.3 mmol/L, P = 0.013; and 0.6 ± 0.2 vs. 0.4 ± 0.1, P = 0.005). Compared with healthy controls, the transplant recipients exhibited elevated serum triglycerides at baseline (median 1.3 [range 0.6-3.2] mmol/L vs. 0.7 [0.3-2.4] mmol/L, P = 0.0002), which correlated negatively with their HDL cholesterol concentration (r = -0.523, P = 0.022). The recipients also exhibited higher apoB-100/apoA-I ratios (0.6 ± 0.2 vs. 0.4 ± 0.1, P = 0.005). In addition, elevated triglyceride levels were still observed after one year of pravastatin use (1.3 [0.5-3.5] mmol/L vs. 0.7 [0.3-2.4] mmol/L, P = 0.0004). Clinically significant elevations in alanine aminotransferase, creatine kinase, or creatinine occurred in neither group.

Conclusions. Immunosuppressive medication considerably increased the plasma pravastatin concentrations. In both patient groups, pravastatin treatment was moderately effective, safe, and well tolerated. In the HeFH group, high baseline cholesterol absorption seemed to predispose patients to insufficient cholesterol-lowering efficacy of pravastatin. In the cardiac transplant recipients, low HDL cholesterol and a high apoB-100/apoA-I ratio were associated with development of TxCAD. Even though pravastatin in the transplant recipients effectively lowered serum total and LDL cholesterol concentrations, it failed to normalize their elevated triglyceride levels and, in some patients, to prevent the progression of TxCAD.
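The AUC(0-10) values reported are conventionally obtained from the measured concentration-time points with the trapezoidal rule. A minimal sketch with hypothetical sampling times and concentrations, not study data:

```python
import numpy as np

# Hypothetical plasma pravastatin concentrations (ng/mL) at sampling times (h):
t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0, 10.0])
c = np.array([0.0, 18.0, 25.0, 15.0, 6.0, 2.5, 0.8])

# Linear trapezoidal rule over the sampling interval:
auc_0_10 = np.sum((c[1:] + c[:-1]) / 2.0 * np.diff(t))
print(f"AUC(0-10) = {auc_0_10:.1f} ng*h/mL")  # about 71 with these numbers
```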

Abstract:

The metabolic syndrome and type 1 diabetes are associated with brain alterations such as cognitive decline, brain infarctions, atrophy, and white matter lesions. Despite the importance of these alterations, their pathomechanism is still poorly understood. This study was conducted to investigate brain glucose and metabolites in healthy individuals with an increased cardiovascular risk and in patients with type 1 diabetes, in order to gather more information on the nature of the known brain alterations.

We studied 43 men aged 20 to 45 years. Study I compared two groups of non-diabetic men, one with an accumulation of cardiovascular risk factors and one without. Studies II to IV compared men with type 1 diabetes (duration of diabetes 6.7 ± 5.2 years, no microvascular complications) with non-diabetic men. Brain glucose, N-acetylaspartate (NAA), total creatine (tCr), choline, and myo-inositol (mI) were quantified with proton magnetic resonance spectroscopy in three cerebral regions (frontal cortex, frontal white matter, and thalamus) and in cerebellar white matter. Data collection was performed for all participants during fasting glycemia and, in a subgroup (Studies III and IV), also during a hyperglycemic clamp that increased the plasma glucose concentration by 12 mmol/l.

In non-diabetic men, the brain glucose concentration correlated linearly with the plasma glucose concentration. The cardiovascular risk group (Study I) had a 13% higher plasma glucose concentration than the control group, but no difference in thalamic glucose content; the risk group thus had a lower thalamic glucose content than expected. They also had 17% higher tCr (a marker of oxidative metabolism). In the control group, tCr correlated with thalamic glucose content, but in the risk group, tCr correlated instead with the fasting plasma glucose and the 2-h plasma glucose concentration in the oral glucose tolerance test. Risk factors of the metabolic syndrome, most importantly insulin resistance, may thus influence brain metabolism.

During fasting glycemia (Study II), regional variation in the cerebral glucose levels appeared in the non-diabetic subjects but not in those with diabetes. In the diabetic patients, excess glucose had accumulated predominantly in the white matter, where the metabolite alterations were also the most pronounced. Compared with control values, the white matter NAA (a marker of neuronal metabolism) was 6% lower and mI (a glial cell marker) 20% higher. Hyperglycemia is therefore a potent risk factor for diabetic brain disease, and the metabolic brain alterations may appear even before any peripheral microvascular complications are detectable.

During acute hyperglycemia (Study III), the increase in cerebral glucose content in the patients with type 1 diabetes was, depending on brain region, between 1.1 and 2.0 mmol/l. An everyday hyperglycemic episode in a diabetic patient may therefore as much as double the brain glucose concentration. While chronic hyperglycemia had led to accumulation of glucose in the white matter, acute hyperglycemia burdened predominantly the gray matter. Acute hyperglycemia also revealed that chronic fluctuation in blood glucose may be associated with alterations in glucose uptake or metabolism in the thalamus.

The cerebellar white matter appeared very different from the cerebral (Study IV). In the non-diabetic men it contained twice as much glucose as the cerebrum. Diabetes had altered neither its glucose content nor its metabolites. The cerebellum therefore seems more resistant to the effects of hyperglycemia than the cerebrum.
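The stated linear relation between brain and plasma glucose can be summarized with an ordinary least-squares line. A small sketch with hypothetical paired measurements, purely to illustrate the fit:

```python
import numpy as np

# Hypothetical paired fasting measurements (mmol/L):
plasma = np.array([4.8, 5.2, 5.6, 6.1, 6.8, 7.4])
brain  = np.array([0.9, 1.0, 1.1, 1.2, 1.4, 1.5])

slope, intercept = np.polyfit(plasma, brain, 1)  # first-degree OLS fit
print(f"brain = {slope:.2f} * plasma + {intercept:.2f}")
```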

Abstract:

Osteoporosis is not only a disease of the elderly but is increasingly diagnosed in chronically ill children. Children with severe motor disabilities, such as cerebral palsy (CP), have many risk factors for osteoporosis. Adults with intellectual disability (ID) are also prone to low bone mineral density (BMD) and increased fractures. This study was carried out to identify risk factors for low BMD and osteoporosis in children with severe motor disability and in adults with ID.

In this study, 59 children with severe motor disability, ranging in age from 5 to 16 years, were evaluated. Lumbar spine BMD was measured with dual-energy x-ray absorptiometry. BMD values were corrected for bone size by calculating bone mineral apparent density (BMAD), and for bone age. The values were transformed into Z-scores by comparison with normative data. Spinal radiographs were assessed for vertebral morphology. Blood samples were obtained for biochemical parameters. Parents were requested to keep a food diary for three days, and the median daily energy and nutrient intakes were calculated. Fractures were common: 17% of the children had sustained peripheral fractures and 25% had compression fractures. BMD was low in these children; the median spinal BMAD Z-score was -1.0 (range -5.0 to +2.0), and the BMAD Z-score was below -2.0 in 20% of the children. A low BMAD Z-score and hypercalciuria were significant risk factors for fractures. In the children with motor disability, calcium intakes were sufficient, while total energy and vitamin D intakes were not.

In the vitamin D intervention studies, 44 children and adolescents with severe motor disability and 138 adults with ID were studied. After baseline blood sampling, the children were divided into two groups: those in the treatment group received 1000 IU of peroral vitamin D3 five days a week for 10 weeks, and subjects in the control group continued with their normal diet. Adults with ID were allocated to receive either 800 IU of peroral vitamin D3 daily for six months or a single intramuscular injection of 150 000 IU of D3. Blood samples were obtained at baseline and after treatment. Serum concentrations of 25-OH-vitamin D (S-25-OHD) were low in all subgroups before the vitamin D intervention: in almost 60% of the children and 77% of the adults, the S-25-OHD concentration was below 50 nmol/L, indicating vitamin D insufficiency. After the intervention, 19% of the children, 42% of the adults who received vitamin D perorally, and 12% of the adults who received vitamin D intramuscularly had an optimal S-25-OHD (>80 nmol/L).

This study demonstrated that low BMD and peripheral and spinal fractures are common in children with severe motor disabilities. Vitamin D status was suboptimal in the majority of the children with motor disability and the adults with ID. Vitamin D insufficiency can be corrected with vitamin D supplements; the peroral dose should be at least 800 IU per day.
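The vitamin D thresholds used in the study (insufficiency below 50 nmol/L, optimal above 80 nmol/L) translate directly into a simple classification; a sketch:

```python
def vitamin_d_status(s25ohd_nmol_l: float) -> str:
    """Classify S-25-OHD using the study's thresholds:
    below 50 nmol/L insufficient, above 80 nmol/L optimal."""
    if s25ohd_nmol_l < 50:
        return "insufficient"
    if s25ohd_nmol_l > 80:
        return "optimal"
    return "intermediate"

print(vitamin_d_status(42))  # insufficient
print(vitamin_d_status(85))  # optimal
```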

Abstract:

Background: Both maternal and fetal complications are increased in diabetic pregnancies. Although hypertensive complications are increased in pregnant women with pregestational diabetes, reports on hypertensive complications in women with gestational diabetes mellitus (GDM) have been contradictory. Congenital malformations and macrosomia are the main fetal complications in Type 1 diabetic pregnancies, whereas fetal macrosomia and birth trauma, but not congenital malformations, are increased in GDM pregnancies.

Aims: To study the frequency of hypertensive disorders in gestational diabetes mellitus. To evaluate the risk of macrosomia and brachial plexus injury (Erb's palsy) and the ability of the 2-hour oral glucose tolerance test (OGTT) combined with the 24-hour glucose profile to distinguish between low and high risks of fetal macrosomia among women with GDM. To evaluate the relationship between glycemic control and the risk of fetal malformations in pregnancies complicated by Type 1 diabetes mellitus. To assess the effect of glycemic control on the occurrence of preeclampsia and pregnancy-induced hypertension (PIH) in Type 1 diabetic pregnancies.

Subjects: A total of 986 women with GDM and 203 women with borderline glucose intolerance (one abnormal value in the OGTT) with a singleton pregnancy, 488 pregnant women with Type 1 diabetes (691 pregnancies and 709 offspring), and 1154 pregnant non-diabetic women (1181 pregnancies and 1187 offspring) were investigated.

Results: In a prospective study of 81 GDM patients, the combined frequency of preeclampsia and PIH was higher than in 327 non-diabetic controls (19.8% vs 6.1%, p<0.001). On the other hand, in the 203 women with only one abnormal value in the OGTT, the rate of hypertensive complications did not differ from that of the controls. Both the GDM women and those with only one abnormal value in the OGTT had higher pre-pregnancy weights and BMIs than the controls. In a retrospective study involving 385 insulin-treated and 520 diet-treated GDM patients, and 805 non-diabetic control pregnant women, fetal macrosomia occurred more often in the insulin-treated GDM pregnancies (18.2%, p<0.001) than in the diet-treated GDM pregnancies (4.4%) or the control pregnancies (2.2%). The rate of Erb's palsy in vaginally delivered infants was 2.7% in the insulin-treated women and 2.4% in the diet-treated women, compared with 0.3% in the controls (p<0.001). The cesarean section rate was more than twice as high (42.3% vs 18.6%) in the insulin-treated GDM patients as in the controls. A major fetal malformation was observed in 30 (4.2%) of the 709 newborn infants in the Type 1 diabetic pregnancies and in 10 (1.4%) of the 735 controls (RR 3.1, 95% CI 1.6–6.2). Even women whose levels of HbA1c (normal values below 5.6%) were only slightly increased in early pregnancy (between 5.6 and 6.8%) had a relative risk of fetal malformation of 3.0 (95% CI 1.2–7.5). Only diabetic patients with a normal HbA1c level (<5.6%) in early pregnancy had the same low risk of fetal malformations as the controls. Preeclampsia was diagnosed in 12.8% and PIH in 11.4% of the 616 Type 1 diabetic women without diabetic nephropathy. The corresponding frequencies among the 854 control women were 2.7% (OR 5.2, 95% CI 3.3–8.4) for preeclampsia and 5.6% (OR 2.2, 95% CI 1.5–3.1) for PIH. Multiple logistic regression analysis indicated that glycemic control, nulliparity, diabetic retinopathy, and duration of diabetes were statistically significant independent predictors of preeclampsia. The adjusted odds ratios for preeclampsia were 1.6 (95% CI 1.3–2.0) for each 1%-unit increment in the HbA1c value during the first trimester and 0.6 (95% CI 0.5–0.8) for each 1%-unit decrement during the first half of pregnancy. In contrast, changes in glycemic control during the second half of pregnancy did not alter the risk of preeclampsia.

Conclusions: In Type 1 diabetic pregnancies it is extremely important to achieve optimal glycemic control before pregnancy and to maintain it throughout pregnancy in order to decrease the complication rates both in the mother and in her offspring. The rate of fetal macrosomia and birth trauma in GDM pregnancies, especially in the group of insulin-treated women, is still relatively high. New strategies for the screening, diagnosis, and treatment of GDM must be developed in order to decrease fetal and neonatal complications.
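An adjusted odds ratio of 1.6 per 1%-unit HbA1c increment corresponds to the exponentiated logistic-regression coefficient; for a change of Δ units the odds multiply by OR^Δ:

```latex
% Logistic model: log odds linear in HbA1c; beta is the fitted coefficient.
\log\frac{p}{1-p} = \beta_0 + \beta\,\mathrm{HbA1c},
\qquad \mathrm{OR} = e^{\beta} = 1.6
% A 2%-unit increment therefore multiplies the odds by 1.6^2 = 2.56.
```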

Abstract:

Cyclosporine is an immunosuppressant drug with a narrow therapeutic index and large variability in pharmacokinetics. To improve cyclosporine dose individualization in children, we used population pharmacokinetic modeling to study the effects of developmental, clinical, and genetic factors on cyclosporine pharmacokinetics in altogether 176 subjects (age range 0.36–20.2 years) before and up to 16 years after renal transplantation. Pre-transplantation test doses of cyclosporine were given intravenously (3 mg/kg) and orally (10 mg/kg), on separate occasions, followed by blood sampling for 24 hours (n=175). After transplantation, in a total of 137 patients, the cyclosporine concentration was quantified at trough, two hours post-dose, or with dose-interval curves. One hundred and four of the studied patients were genotyped for 17 putatively functionally significant sequence variations in the ABCB1, SLCO1B1, ABCC2, CYP3A4, CYP3A5, and NR1I2 genes. Pharmacokinetic modeling was performed with the nonlinear mixed-effects modeling program NONMEM. A 3-compartment population pharmacokinetic model with first-order absorption and without lag time was used to describe the data.

The most important covariate affecting systemic clearance and distribution volume was allometrically scaled body weight, i.e., body weight^(3/4) for clearance and absolute body weight for volume of distribution. The clearance adjusted for absolute body weight declined with age, and pre-pubertal children (< 8 years) had an approximately 25% higher clearance/body weight (L/h/kg) than did older children. Adjustment of clearance for allometric body weight removed its relationship to age after the first year of life. This finding is consistent with a gradual reduction in relative liver size towards adult values, and a relatively constant CYP3A content in the liver from about 6–12 months of age to adulthood. The other significant covariates affecting cyclosporine clearance and volume of distribution were hematocrit, plasma cholesterol, and serum creatinine, explaining up to 20%–30% of the inter-individual differences before transplantation. After transplantation, their predictive role was smaller, as the variations in hematocrit, plasma cholesterol, and serum creatinine were also smaller.

Before transplantation, no clinical or demographic covariates were found to affect oral bioavailability, and no systematic age-related changes in oral bioavailability were observed. After transplantation, older children receiving cyclosporine twice daily as the gelatine capsule microemulsion formulation had an about 1.25–1.3 times higher bioavailability than did the younger children receiving the liquid microemulsion formulation thrice daily. Moreover, cyclosporine oral bioavailability increased over 1.5-fold in the first month after transplantation, returning thereafter gradually to its initial value over 1–1.5 years. The largest cyclosporine doses were administered in the first 3–6 months after transplantation, and thereafter the single doses of cyclosporine were often smaller than 3 mg/kg. Thus, the results suggest that cyclosporine displays dose-dependent, saturable pre-systemic metabolism even at low single doses, whereas complete saturation of CYP3A4 and MDR1 (P-glycoprotein) renders cyclosporine pharmacokinetics dose-linear at higher doses.

No significant associations were found between genetic polymorphisms and cyclosporine pharmacokinetics before transplantation in the whole population for which genetic data were available (n=104). However, in children older than eight years (n=22), heterozygous and homozygous carriers of the ABCB1 c.2677T or c.1236T alleles had an about 1.3 times or 1.6 times higher oral bioavailability, respectively, than did non-carriers. After transplantation, none of the ABCB1 SNPs or any other SNPs were found to be associated with cyclosporine clearance or oral bioavailability in the whole population, in the patients older than eight years, or in the patients younger than eight years. In the whole population, however, in patients carrying the NR1I2 g.-25385C–g.-24381A–g.-205_-200GAGAAG–g.7635G–g.8055C haplotype, the bioavailability of cyclosporine was about one tenth lower, per allele, than in non-carriers. This effect was significant also in the subgroup of patients older than eight years. Furthermore, in patients carrying the NR1I2 g.-25385C–g.-24381A–g.-205_-200GAGAAG–g.7635G–g.8055T haplotype, the bioavailability was almost one fifth higher, per allele, than in non-carriers.

It may be possible to improve the individualization of cyclosporine dosing in children by accounting for the effects of developmental factors (body weight, liver size), time after transplantation, and cyclosporine dosing frequency/formulation. Further studies are required on the predictive value of genotyping for the individualization of cyclosporine dosing in children.
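The allometric covariate model described above fixes the body weight exponent at 3/4 for clearance and 1 for volume. A minimal sketch of that scaling; the reference values are hypothetical, not the fitted NONMEM estimates:

```python
def clearance_l_per_h(wt_kg: float, cl_std: float = 30.0,
                      wt_std: float = 70.0) -> float:
    """Allometric clearance: CL = CL_std * (WT / WT_std) ** 0.75.
    CL_std and WT_std are illustrative reference values, not study estimates."""
    return cl_std * (wt_kg / wt_std) ** 0.75

def volume_l(wt_kg: float, v_std: float = 100.0, wt_std: float = 70.0) -> float:
    """Volume of distribution scales linearly with body weight."""
    return v_std * (wt_kg / wt_std)

# A small child has a higher per-kilogram clearance than an adult,
# consistent with the ~25% higher CL/kg reported in pre-pubertal children.
for wt in (14.0, 70.0):
    cl = clearance_l_per_h(wt)
    print(f"{wt:>4.0f} kg: CL = {cl:5.1f} L/h ({cl / wt:.2f} L/h/kg)")
```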