Abstract:
Infection is a major cause of mortality and morbidity after thoracic organ transplantation. The aim of the present study was to evaluate the infectious complications after lung and heart transplantation, with a special emphasis on the usefulness of bronchoscopy and the demonstration of cytomegalovirus (CMV), human herpesvirus (HHV)-6, and HHV-7. We reviewed all the consecutive bronchoscopies performed on heart transplant recipients (HTRs) from May 1988 to December 2001 (n = 44) and lung transplant recipients (LTRs) from February 1994 to November 2002 (n = 472). To compare different assays in the detection of CMV, a total of 21 thoracic organ transplant recipients were prospectively monitored with CMV pp65-antigenemia, DNAemia (PCR), and mRNAemia (NASBA) tests. The antigenemia test was the reference assay for therapeutic intervention. In addition to CMV antigenemia, 22 LTRs were monitored for HHV-6 and HHV-7 antigenemia. The diagnostic yield of the clinically indicated bronchoscopies was 41% in the HTRs and 61% in the LTRs. The utility of bronchoscopy was highest between one and six months after transplantation. In contrast, the findings from the surveillance bronchoscopies performed on LTRs led to a change in the ongoing treatment in only 6% of the cases. Pneumocystis carinii and CMV were the most commonly detected pathogens. Furthermore, 15 (65%) of the P. carinii infections in the LTRs were detected during chemoprophylaxis. None of the complications of the bronchoscopies were fatal. Antigenemia, DNAemia, and mRNAemia were present in 98%, 72%, and 43% of the CMV infections, respectively. The optimal DNAemia cut-off levels (sensitivity/specificity) were 400 (75.9%/92.7%), 850 (91.3%/91.3%), and 1250 (100%/91.5%) copies/ml for antigenemia levels of 2, 5, and 10 pp65-positive leukocytes/50 000 leukocytes, respectively. The sensitivities of NASBA in detecting the same cut-off levels were 25.9%, 43.5%, and 56.3%. CMV DNAemia was detected in 93% and mRNAemia in 61% of the CMV antigenemia episodes requiring antiviral therapy. HHV-6, HHV-7, and CMV antigenemia were detected in 20 (91%), 11 (50%), and 12 (55%) of the 22 LTRs (at a median of 16, 31, and 165 days), respectively. HHV-6 appeared in 15 (79%), HHV-7 in seven (37%), and CMV in one (7%) of these patients during ganciclovir or valganciclovir prophylaxis. One case of pneumonitis and another of encephalitis were associated with HHV-6. In conclusion, bronchoscopy is a safe and useful diagnostic tool in LTRs and HTRs with a suspected respiratory infection, but the role of surveillance bronchoscopy in LTRs remains controversial. The PCR assay performs comparably to the antigenemia test in guiding pre-emptive therapy against CMV when thresholds above 5 pp65-antigen-positive leukocytes are used. In contrast, the low sensitivity of NASBA limits its usefulness. HHV-6 and HHV-7 activation is common after lung transplantation despite ganciclovir or valganciclovir prophylaxis, but clinical manifestations are infrequently linked to these viruses.
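To make the reported cut-off analysis concrete, the sketch below shows how the sensitivity and specificity of a DNAemia cut-off could be computed against the pp65-antigenemia reference; the function and the sample values are illustrative assumptions, not the study's data or code.

```python
# Illustrative sketch: sensitivity/specificity of a CMV DNAemia (PCR) cut-off
# evaluated against the pp65-antigenemia reference assay. The sample values
# below are hypothetical and do not come from the study.

def sensitivity_specificity(samples, dna_cutoff, ag_cutoff):
    """samples: iterable of (dna_copies_per_ml, pp65_positive_cells) pairs."""
    tp = fp = tn = fn = 0
    for dna, ag in samples:
        reference_positive = ag >= ag_cutoff   # antigenemia defines the reference result
        test_positive = dna >= dna_cutoff      # DNAemia result under evaluation
        if reference_positive and test_positive:
            tp += 1
        elif reference_positive:
            fn += 1
        elif test_positive:
            fp += 1
        else:
            tn += 1
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity

# Hypothetical example: the 850 copies/ml cut-off against the threshold of
# 5 pp65-positive leukocytes/50 000 leukocytes.
example = [(1200, 8), (300, 0), (900, 6), (100, 1), (2000, 12), (700, 0)]
print(sensitivity_specificity(example, dna_cutoff=850, ag_cutoff=5))
```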
Abstract:
Several studies link the consumption of whole-grain products to a lowered risk of chronic diseases, such as certain types of cancer, type II diabetes, and cardiovascular diseases. However, the exact protective mechanisms remain unclear, partly due to the lack of a suitable biomarker of whole-grain cereal intake. Alkylresorcinols (AR) are phenolic lipids abundant in the outer parts of wheat and rye grains, usually occurring as homologues with C15:0–C25:0 alkyl chains, and have been suggested to function as whole-grain biomarkers. The mammalian lignan enterolactone (ENL) has also previously been studied as a potential whole-grain biomarker. In the present work, a quantitative gas chromatography–mass spectrometry method for the analysis of AR in plasma, erythrocytes, and lipoproteins was developed. The method was used to determine human and pig plasma AR concentrations after the intake of whole-grain wheat and rye products compared to low-fibre wheat bread diets, to assess the usability of AR as biomarkers of whole-grain intake. AR plasma concentrations were compared to serum ENL concentrations. AR absorption and elimination kinetics were investigated in a pig model. AR occurrence in human erythrocyte membranes and plasma lipoproteins was determined, and the distribution of AR in blood was evaluated. Plasma AR seem to be absorbed via the lymphatic system from the small intestine, like many other lipophilic compounds. Their apparent elimination half-life is relatively short and similar to that of tocopherols, which have a similar chemical structure. Plasma AR concentrations increased significantly after a one- to eight-week intake of whole-grain wheat, and further with whole-grain rye bread. The concentrations were also higher after a habitual Finnish diet than after a diet with low-fibre bread. Inter-individual variation after a one-week intake of the same amount of bread was high, but the mean plasma AR concentrations increased with increasing AR intake. AR are incorporated into erythrocyte membranes and plasma lipoproteins, and VLDL and HDL were the main AR carriers in human plasma. Based on these studies, plasma AR could function as specific biomarkers of dietary whole-grain products. AR are found exclusively in whole grains and are more suitable as specific biomarkers of whole-grain intake than the previously investigated mammalian lignan enterolactone, which is formed from several dietary plants other than cereals. The plasma AR C17:0/C21:0 ratio could distinguish whether the whole-grain products in the diet are mainly wheat or rye. AR could be used in epidemiological studies to determine whole-grain intake and to better assess the role of whole grains in disease prevention.
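For background on the elimination half-life referred to above, the standard first-order kinetic relationships are summarized below; these are textbook formulas assumed for illustration, not results or equations taken from the thesis.

```latex
% First-order elimination (textbook relationships, assumed for illustration):
% plasma concentration decays exponentially, the half-life follows from the
% elimination rate constant, and the rate constant can be estimated from two
% post-absorption concentrations.
\[
C(t) = C_0\, e^{-k_e t},
\qquad
t_{1/2} = \frac{\ln 2}{k_e},
\qquad
k_e = \frac{\ln C(t_1) - \ln C(t_2)}{t_2 - t_1}.
\]
```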
Abstract:
Atopic dermatitis (AD), or atopic eczema, is characterised by a superficial skin inflammation with an overall Th2 cell dominance and impaired function of the epidermal barrier. Patients are also at an increased risk for asthma and allergic rhinitis. Treatment with tacrolimus ointment inhibits T cell activation and blocks the production of several inflammatory cytokines in the skin, without suppressing collagen synthesis. The aims of this thesis were to determine: (1) the long-term efficacy, safety, and effects on cell-mediated immunity and serum IgE levels in patients with moderate-to-severe AD treated for 1 year with tacrolimus ointment or a corticosteroid regimen, (2) the 10-year outcome of eczema, respiratory symptoms, and serum IgE levels in AD patients initially treated long-term with tacrolimus ointment, and (3) the pharmacokinetics and long-term safety and efficacy of 0.03% tacrolimus ointment in infants under age 2 with AD. Cell-mediated immunity, reflecting Th1 cell reactivity, was measured with recall antigens and was lower at baseline in patients with AD than in healthy controls. Treatment with either 0.1% tacrolimus ointment or a corticosteroid regimen for one year enhanced recall antigen reactivity. Transepidermal water loss (TEWL), an indicator of skin barrier function, decreased at months 6 and 12 in both tacrolimus- and corticosteroid-treated patients; TEWL for the head and neck was significantly lower in tacrolimus-treated patients. Patients in the 10-year open follow-up study showed a decrease in affected body surface area from 19.0% at baseline to 1.6% at 10 years, and those with bronchial hyper-responsiveness at baseline showed an increase in the provocative dose of inhaled histamine producing a 15% decrease in FEV1, indicating reduced hyper-responsiveness. Patient-reported respiratory symptoms (asthma and rhinitis) decreased in those with active symptoms at baseline. A good treatment response after one year of tacrolimus treatment predicted a good treatment response throughout the 10-year follow-up and a decrease in total serum IgE levels at the 10-year follow-up visit. The 2-week pharmacokinetic study and the long-term study with 0.03% tacrolimus ointment showed good and continuous improvement of AD in the infants. Tacrolimus blood levels remained low throughout the study, and treatment was well tolerated. This thesis underlines the importance of effective long-term topical treatment of AD. When the active skin inflammation decreases, cell-mediated immunity of the skin improves and a secondary marker of Th2 cell reactivity, total serum IgE, decreases. Respiratory symptoms seem to improve when the eczema area decreases. All these effects can be attributed to improvement of skin barrier function. One potential method to prevent progression from AD to asthma and allergic rhinitis may be avoidance of early sensitisation through the skin, so early treatment of AD in infants is crucial. Long-term treatment with 0.03% tacrolimus ointment was effective and safe in infants over age 3 months.
Abstract:
The adequacy of anesthesia has been studied since the introduction of balanced general anesthesia. Commercial monitors based on electroencephalographic (EEG) signal analysis have been available for monitoring the hypnotic component of anesthesia since the beginning of the 1990s. Monitors measuring the depth of anesthesia assess the cortical function of the brain and have gained acceptance during surgical anesthesia with most of the anesthetic agents used. However, due to frequent artifacts, they are considered unsuitable for monitoring consciousness in intensive care patients. The assessment of analgesia is one of the cornerstones of general anesthesia. Prolonged surgical stress may lead to increased morbidity and delayed postoperative recovery. However, no validated monitoring method is currently available for evaluating analgesia during general anesthesia. Awareness during anesthesia is caused by an inadequate level of hypnosis. This rare but severe complication of general anesthesia may lead to marked emotional stress and possibly posttraumatic stress disorder. In the present series of studies, the incidence of awareness and recall during outpatient anesthesia was evaluated and compared with that in inpatient anesthesia. A total of 1500 outpatients and 2343 inpatients underwent a structured interview. Clear intraoperative recollections were rare, the incidence being 0.07% in outpatients and 0.13% in inpatients. No significant differences emerged between outpatients and inpatients. However, significantly smaller doses of sevoflurane were administered to outpatients with awareness than to those without recollections (p<0.05). EEG artifacts in 16 brain-dead organ donors were evaluated during organ harvest surgery in a prospective, open, nonselective study. The source of the frontotemporal biosignals in brain-dead subjects was studied, and the resistance of the bispectral index (BIS) and Entropy to the signal artifacts was compared. The hypothesis was that in brain-dead subjects, most of the biosignals recorded from the forehead would consist of artifacts. The original EEG was recorded, and State Entropy (SE), Response Entropy (RE), and BIS were calculated and monitored during solid organ harvest. SE differed from zero (inactive EEG) in 28%, RE in 29%, and BIS in 68% of the total recording time (p<0.0001 for all). The median values during the operation were SE 0.0, RE 0.0, and BIS 3.0. In four of the 16 organ donors, the EEG was not inactive, and unphysiologically distributed, nonreactive rhythmic theta activity was present in the original EEG signal. After the results from subjects with persistent residual EEG activity were excluded, SE, RE, and BIS differed from zero in 17%, 18%, and 62% of the recorded time, respectively (p<0.0001 for all). Due to various artifacts, the highest readings in all indices were recorded without neuromuscular blockade. The main sources of artifacts were electrocauterization, electromyography (EMG), 50-Hz artifact, handling of the donor, ballistocardiography, and electrocardiography. In a prospective, randomized study of 26 patients, the ability of the Surgical Stress Index (SSI) to differentiate patients with two clinically different analgesic levels during shoulder surgery was evaluated. SSI values were lower in patients with an interscalene brachial plexus block than in patients without an additional plexus block. In all patients, anesthesia was maintained with desflurane, the concentration of which was targeted to maintain SE at 50.
Increased blood pressure or heart rate (HR), movement, and coughing were considered signs of intraoperative nociception and were treated with alfentanil. Photoplethysmographic waveforms were collected from the arm contralateral to the operated side, and SSI was calculated offline. Two minutes after skin incision, SSI was not increased in the brachial plexus block group and was lower (38 ± 13) than in the control group (58 ± 13, p<0.005). Among the controls, one minute prior to alfentanil administration, the SSI value was higher than during periods of adequate antinociception, 59 ± 11 vs. 39 ± 12 (p<0.01). The total cumulative need for alfentanil was higher in the controls (2.7 ± 1.2 mg) than in the brachial plexus block group (1.6 ± 0.5 mg, p=0.008). Tetanic stimulation to the ulnar region of the hand increased SSI significantly only among patients with a brachial plexus block not covering the site of stimulation. The prognostic value of EEG-derived indices was evaluated and compared with transcranial Doppler ultrasonography (TCD), serum neuron-specific enolase (NSE), and S-100B after cardiac arrest. Thirty patients resuscitated from out-of-hospital arrest and treated with induced mild hypothermia for 24 h were included. The original EEG signal was recorded, and the burst suppression ratio (BSR), RE, SE, and wavelet subband entropy (WSE) were calculated. Neurological outcome during the six-month period after arrest was assessed with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). Twenty patients had a CPC of 1-2, one patient had a CPC of 3, and nine patients died (CPC 5). BSR, RE, and SE differed between the good (CPC 1-2) and poor (CPC 3-5) outcome groups (p=0.011, p=0.011, and p=0.008, respectively) during the first 24 h after arrest. WSE was borderline higher in the good outcome group between 24 and 48 h after arrest (p=0.050). All patients with status epilepticus died, and their WSE values were lower (p=0.022). S-100B was lower in the good outcome group upon arrival at the intensive care unit (p=0.010). After hypothermia treatment, NSE and S-100B values were lower (p=0.002 for both) in the good outcome group. The pulsatility index was also lower in the good outcome group (p=0.004). In conclusion, the incidence of awareness in outpatient anesthesia did not differ from that in inpatient anesthesia. Outpatients are not at increased risk for intraoperative awareness relative to inpatients undergoing general anesthesia. SE, RE, and BIS showed non-zero values that normally indicate cortical neuronal function, but in these subjects they were mostly due to artifacts after the clinical diagnosis of brain death. Entropy was more resistant to artifacts than BIS. During general anesthesia and surgery, SSI values were lower in patients with an interscalene brachial plexus block covering the sites of nociceptive stimuli. In detecting nociceptive stimuli, SSI performed better than HR, blood pressure, or RE. BSR, RE, and SE differed between the good and poor neurological outcome groups during the first 24 h after cardiac arrest and may aid in differentiating patients with good neurological outcomes from those with poor outcomes after out-of-hospital cardiac arrest.
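Of the EEG-derived indices above, the burst suppression ratio is the most straightforward to make concrete: it is conventionally defined as the percentage of an epoch during which the EEG is suppressed. The sketch below computes that conventional definition; the amplitude threshold, minimum suppression duration, and function name are illustrative assumptions, not the monitors' or the study's actual implementation.

```python
import numpy as np

# Illustrative sketch of a conventional burst suppression ratio (BSR):
# the percentage of an EEG epoch during which the signal stays below an
# amplitude threshold for at least a minimum duration. The threshold and
# minimum run length here are assumptions, not the study's settings.

def burst_suppression_ratio(eeg, fs, amp_thresh_uv=5.0, min_suppression_s=0.5):
    suppressed = np.abs(eeg) < amp_thresh_uv      # sample-wise suppression mask
    min_len = int(min_suppression_s * fs)         # minimum run length in samples
    bsr_mask = np.zeros_like(suppressed)
    run_start = None
    for i, s in enumerate(suppressed):
        if s and run_start is None:
            run_start = i
        elif not s and run_start is not None:
            if i - run_start >= min_len:
                bsr_mask[run_start:i] = True      # keep only sufficiently long runs
            run_start = None
    if run_start is not None and len(suppressed) - run_start >= min_len:
        bsr_mask[run_start:] = True
    return 100.0 * bsr_mask.mean()                # percentage of the epoch suppressed

# Hypothetical usage on one 60-second epoch sampled at 100 Hz:
# bsr = burst_suppression_ratio(epoch, fs=100)
```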
Abstract:
The purpose of this study was to evaluate subjective food-related gastrointestinal symptoms and their relation to cow’s milk by determining the genotype of adult-type hypolactasia, measuring antibodies against milk protein, and screening for the most common cause of secondary hypolactasia, namely coeliac disease. The whole study group comprised 1900 adults who gave a blood sample for the study when they attended a health care centre laboratory for various reasons. Of these, 1885 (99%) completed a questionnaire on food-related gastrointestinal symptoms. Study No. I evaluated the prevalence of adult-type hypolactasia and its correlation with self-reported milk-induced gastrointestinal symptoms. Hypolactasia was tested for by determining the C/T-13910 genotypes of the study subjects. The results show that patients with the C/C-13910 genotype, associated with adult-type hypolactasia, consume less milk than those with the C/T-13910 and T/T-13910 genotypes. Study No. II evaluated the prevalence and clinical characteristics of undiagnosed coeliac disease in the whole study population using transglutaminase and endomysium antibodies and their correlation with gastrointestinal symptoms. The prevalence of coeliac disease was 2%, which is surprisingly high. Serum transglutaminase and endomysium antibodies are valuable tools for recognising undiagnosed coeliac disease in outpatient clinics. In Study No. III, milk protein IgE-related hypersensitivity was evaluated by stratifying all 756 study subjects with milk-related problems and randomly choosing 100 age- and sex-matched controls with no such symptoms from the rest of the original study group. In Study No. IV, 400 serum samples were randomly selected for the analysis of milk protein-related IgA and IgG antibodies and their correlation with milk-related GI symptoms. The measurements of milk protein IgA, IgE, and IgG (Studies No. III and IV) did not correlate clearly with milk-induced symptoms and gave no clinically significant information; hence their measurement is not encouraged in outpatient clinics. In conclusion, adult-type hypolactasia is often considered the reason for gastrointestinal symptoms in adults, and determination of the C/T-13910 genotypes is a practical way of diagnosing adult-type hypolactasia in an outpatient setting. Undiagnosed coeliac disease should be actively screened for and diagnosed in order to institute a gluten-free diet and avoid GI symptoms and nutritional deficiencies. Cow’s milk hypersensitivity in the adult population is difficult to diagnose, since the mechanism by which it is mediated is still unclear. Cow’s milk protein-specific IgE, IgA, and IgG antibodies do not correlate with subjective milk-related GI symptoms.
Abstract:
The systemic autoinflammatory disorders are a group of rare diseases characterized by periodically recurring episodes of acute inflammation and a rise in serum acute phase proteins, but with no signs of autoimmunity. At present, eight hereditary syndromes are categorized as autoinflammatory, although the definition has also occasionally been extended to other inflammatory disorders, such as Crohn's disease. One of the autoinflammatory disorders is the autosomal dominantly inherited tumour necrosis factor receptor-associated periodic syndrome (TRAPS), which is caused by mutations in the gene encoding the tumour necrosis factor type 1 receptor (TNFRSF1A). In patients of Nordic descent, cases of TRAPS and of three other hereditary fevers, hyperimmunoglobulinemia D with periodic fever syndrome (HIDS), chronic infantile neurologic, cutaneous and articular syndrome (CINCA), and familial cold autoinflammatory syndrome (FCAS), have been reported, TRAPS being the most common of the four. Clinical characteristics of TRAPS are recurrent attacks of high spiking fever, associated with inflammation of serosal membranes and joints, myalgia, migratory rash, and conjunctivitis or periorbital cellulitis. Systemic AA amyloidosis may occur as a sequel of the systemic inflammation. The aim of this study was to investigate the genetic background of hereditary periodically occurring fever syndromes in Finnish patients, to explore the reliability of determining serum concentrations of soluble TNFRSF1A and metalloproteinase-induced TNFRSF1A shedding as helpful tools in differential diagnostics, and to study intracellular NF-κB signalling in an attempt to widen the knowledge of the pathomechanisms underlying TRAPS. Genomic sequencing revealed two novel TNFRSF1A mutations, F112I and C73R, in two Finnish families. F112I was the first TNFRSF1A mutation to be reported in the third extracellular cysteine-rich domain of the gene, and C73R was the third novel mutation to be reported in a Finnish family, with only one other TNFRSF1A mutation having been reported in the Nordic countries. We also presented a differential diagnostic problem in a TRAPS patient, emphasizing for the clinician the importance of differential diagnostic vigilance in dealing with rare hereditary disorders. The patient's underlying genetic disease both served as a misleading factor, possibly postponing arrival at the correct diagnosis, and may also have predisposed to the pathologic condition that led to the patient's critical state. Using a method of flow cytometric analysis modified for use on fresh whole blood, we studied intracellular signalling pathways in three Finnish TRAPS families with the F112I, C73R, and the previously reported C88Y mutations. Evaluation of TNF-induced phosphorylation of NF-κB and p38 revealed low phosphorylation profiles in nine out of ten TRAPS patients in comparison to healthy control subjects. This study shows that TRAPS is a diagnostic possibility in patients of Nordic descent with symptoms of periodically recurring fever and inflammation of the serosa and joints. In particular, in the case of a family history of febrile episodes, the possibility of TRAPS should be considered if an etiology of autoimmune or infectious nature is excluded.
The discovery of three different mutations in a population as small as the Finnish one reinforces the notion that the extracellular domain of TNFRSF1A is prone to mutation along the entire stretch of its cysteine-rich domains and not only at a limited number of sites, suggesting the absence of a founder effect in TRAPS. This study also demonstrates the challenges of clinical work in differentiating the symptoms of rare genetic disorders from those of other pathologic conditions and presents the possibility of an autoinflammatory disorder being the underlying cause of severe clinical complications. Furthermore, functional studies of fresh blood leukocytes show that TRAPS is often associated with a low NF-κB and p38 phosphorylation profile, although low phosphorylation levels are not a requirement for the development of TRAPS. The aberrant signalling would suggest that the hyperinflammatory phenotype of TRAPS is the result of compensatory NF-κB-mediated regulatory mechanisms triggered by a deficiency of the innate immune response.
Abstract:
Sjögren's syndrome (SS) is a common autoimmune disease affecting the lacrimal and salivary glands. SS is characterized by a considerable female predominance and a late age of onset, commonly around the time of adrenopause and menopause. The levels of the androgen prohormone dehydroepiandrosterone sulphate (DHEA-S) in the serum are lower in patients with SS than in age- and sex-matched healthy control subjects. The possible systemic effects of low androgen levels in SS are not currently well understood. Basement membranes (BM) are specialized layers of extracellular matrix composed of laminin (LM) and type IV collagen networks. BMs deliver messages to epithelial cells via cellular LM-receptors, including integrins (Int) and the Lutheran blood group antigen (Lu). The composition of BMs and the distribution of LM-receptors in labial salivary glands (LSGs) of normal healthy controls and patients with SS were assessed. LMs have a complex and highly regulated distribution in LSGs and seem to have specific tasks in the dynamic regulation of acinar cell function. LM-111 is important for normal acinar cell differentiation, and its expression is diminished in SS. LM-211 and -411 also seem to have some acinar-specific functional tasks in LSGs. LM-311, -332, and -511 seem to have more general structure-maintaining and supporting roles in LSGs and remain relatively intact in SS. Ints α3β1, α6β1, and α6β4 and Lu seem to supply the structural basis for the firm attachment of epithelial cells to the BM in LSGs. The expression of Ints α1β1 and α2β1 differed clearly from that of the other LM-receptors in that they were found almost exclusively around the acini and intercalated duct cells of the salivons, suggesting some type of acinar cell compartment-specific or dominant function. Expression of these integrins was lower in SS than in healthy controls, suggesting that the LM-111 and -211-to-Int α1β1 and α2β1 interactions are defective in SS and are crucial to the maintenance of the acini in LSGs. The DHEA/DHEA-S concentration in serum, and locally in saliva, of patients with SS seems to have effects on the salivary glands. These effects were first detected using the androgen-dependent CRISP-3 protein, the production and secretion of which were clearly diminished in SS. This might be due to impaired function of the intracrine DHEA prohormone-metabolizing machinery, which fails to successfully convert DHEA into its active metabolites in LSGs. Progenitor epithelial cells from the intercalated ductal area of LSGs migrate to the acinar compartment and then undergo a phenotype change into secretory acinar cells. This migration and phenotype change seem to be regulated by the LM-111-to-Int α1β1/Int α2β1 interactions, and lack of these interactions could be one factor limiting the normal remodelling process. Androgens are effective stimulators of Int α1β1 and α2β1 expression at physiologic concentrations. Addition of DHEA to the culture medium effectively stimulated Int α1β1 and α2β1 expression, and this effect may be deficient in the LSGs of patients with SS.
Abstract:
Introduction: The pathogenesis of diabetic nephropathy remains a matter of debate, although strong evidence suggests that it results from the interaction between susceptibility genes and the diabetic milieu. The true pathogenetic mechanism remains unknown, but a common denominator of micro- and macrovascular complications may exist. Some have suggested that low-grade inflammation and activation of the innate immune system might play a synergistic role in the pathogenesis of diabetic nephropathy. Aims of the study: The present studies were undertaken to investigate whether low-grade inflammation, mannan-binding lectin (MBL), and α-defensin play a role, together with adiponectin, in patients with type 1 diabetes and diabetic nephropathy. Subjects and methods: This study is part of the ongoing Finnish Diabetic Nephropathy Study (FinnDiane). The first four cross-sectional substudies of this thesis comprised 194 patients with type 1 diabetes divided into three groups (normo-, micro-, and macroalbuminuria) according to their albumin excretion rate (AER). The fifth substudy aimed to determine whether baseline serum adiponectin plays a role in the development and progression of diabetic nephropathy. This follow-up study included 1330 patients with type 1 diabetes and a mean follow-up period of five years. The patients were divided into three groups depending on their AER at baseline. As measures of low-grade inflammation, high-sensitivity CRP (hsCRP) and α-defensin were measured with radioimmunoassay, and interleukin-6 (IL-6) with a high-sensitivity enzyme immunoassay. Mannan-binding lectin and adiponectin were determined with time-resolved immunofluorometric assays. The progression of albuminuria from one stage to another served as a measure of the progression of diabetic nephropathy. Results: Low-grade inflammatory markers, MBL, adiponectin, and α-defensin were all associated with diabetic nephropathy, whereas MBL, adiponectin, and α-defensin per se were unassociated with low-grade inflammatory markers. AER was the only clinical variable independently associated with hsCRP. AER, HDL-cholesterol, and the duration of diabetes were independently associated with IL-6. HbA1c was the only variable independently associated with MBL. The estimated glomerular filtration rate (eGFR), AER, and waist-to-hip ratio were independently associated with adiponectin. Systolic blood pressure, HDL-cholesterol, total cholesterol, age, and eGFR were all independently associated with α-defensin. In patients with macroalbuminuria, progression to end-stage renal disease (ESRD) was associated with higher baseline adiponectin concentrations. Discussion and conclusions: Low-grade inflammation, MBL, adiponectin, and α-defensin were all associated with diabetic nephropathy in these cross-sectional studies. In contrast, however, MBL, adiponectin, and α-defensin were not associated with low-grade inflammatory markers per se, nor was α-defensin associated with MBL, which may suggest that these different players function in a coordinated fashion during the deleterious process of diabetic nephropathy. The question of what causes low-grade inflammation in patients with type 1 diabetes and diabetic nephropathy, however, remains unanswered.
In our study, glycemic control, an atherogenic lipid profile, and the waist-to-hip ratio (WHR) were associated with low-grade inflammation in the univariate analysis, although in the multivariate analysis only AER, HDL-cholesterol, and the duration of diabetes, as a measure of glycemic load, proved to be independently associated with inflammation. Notably, all these factors are modifiable with changes in lifestyle and/or with targeted medication. In the follow-up study, elevated serum adiponectin levels at baseline predicted the progression from macroalbuminuria to ESRD independently of renal function at baseline. This observation does not preclude adiponectin as a favorable factor during the process of diabetic nephropathy, since the rise in serum adiponectin concentrations may be a mechanism by which the body compensates for the demands created by the diabetic milieu.
Abstract:
Pediatric renal transplantation (TX) has evolved greatly during the past few decades, and today TX is considered the standard of care for children with end-stage renal disease. In Finland, 191 children had received renal transplants by October 2007, and 42% of them have already reached adulthood. Improvements in the treatment of end-stage renal disease, surgical techniques, intensive care medicine, and immunosuppressive therapy have paved the way to the current highly successful outcomes of pediatric transplantation. In children, the transplanted graft should last for decades, and normal growth and development should be guaranteed. These objectives set considerable requirements for optimizing and fine-tuning post-operative therapy. Careful optimization of immunosuppressive therapy is crucial for protecting the graft against rejection, but also for protecting the patient against the adverse effects of the medication. In the present study, the results of a retrospective investigation into individualized dosing of immunosuppressive medication, based on pharmacokinetic profiles, therapeutic drug monitoring, graft function and histology studies, and glucocorticoid biological activity determinations, are reported. Subgroups of a total of 178 patients, who received renal transplants in 1988–2006, were included in the study. The mean age at TX was 6.5 years, and approximately 26% of the patients were <2 years of age. The most common diagnosis leading to renal TX was congenital nephrosis of the Finnish type (NPHS1). Pediatric patients in Finland receive standard triple immunosuppression consisting of cyclosporine A (CsA), methylprednisolone (MP), and azathioprine (AZA) after renal TX. Optimal dosing of these agents is important to prevent rejections and preserve graft function on the one hand, and to avoid potentially serious adverse effects on the other. CsA has a narrow therapeutic window and individually variable pharmacokinetics; therapeutic monitoring of CsA is therefore mandatory. Traditionally, CsA monitoring has been based on pre-dose trough levels (C0), but recent pharmacokinetic and clinical studies have revealed that the immunosuppressive effect may be related to diurnal CsA exposure and the blood CsA concentration 0-4 hours after dosing. The two-hour post-dose concentration (C2) has proved a reliable surrogate marker of CsA exposure. Individual starting doses of CsA were analyzed in 65 patients. A recommended dose based on a pre-TX pharmacokinetic study was calculated for each patient according to the pre-TX protocol. The predicted dose was clearly higher in the youngest children than in the older ones (22.9±10.4 and 10.5±5.1 mg/kg/d in patients <2 and >8 years of age, respectively). The actually administered oral doses of CsA were collected for three weeks after TX and compared to the pharmacokinetically predicted dose. After the TX, dosing of CsA was adjusted according to clinical parameters and the blood CsA trough concentration. The pharmacokinetically predicted dose and patient age were the two significant parameters explaining post-TX doses of CsA. Accordingly, young children received significantly higher oral doses of CsA than the older ones. The correlation with the actually administered doses after TX was best in those patients who had a predicted dose clearly higher or lower (> ±25%) than the average in their age group. Due to the great individual variation in pharmacokinetics, standardized dosing of CsA (based on body mass or surface area) may not be adequate.
Pre-TX profiles are helpful in determining suitable initial CsA doses. CsA monitoring based on trough and C2 concentrations was analyzed in 47 patients who received renal transplants in 2001–2006. C0, C2, and acute rejection episodes were recorded during the post-TX hospitalization and also three months after TX, when the first protocol core biopsy was obtained. The patients who remained rejection-free had slightly higher C2 concentrations, especially very early after TX. However, after the first two weeks the trough level was also higher in the rejection-free patients than in those with acute rejections. Three months after TX, the trough level was higher in patients with normal histology than in those with rejection changes in the routine biopsy. Monitoring of both the trough level and C2 may thus be warranted to guarantee a sufficient peak concentration and baseline immunosuppression on the one hand and to avoid over-exposure on the other. Controlling rejection in the early months after transplantation is crucial, as early rejection may contribute to the development of long-term allograft nephropathy. Recently, it has become evident that immunoactivation fulfilling the histological criteria of acute rejection is possible in a well-functioning graft with no clinical signs or laboratory perturbations. The influence of treating subclinical rejection, diagnosed in the 3-month protocol biopsy, on graft function and histology 18 months after TX was analyzed in 22 patients and compared with 35 historical control patients. The incidence of subclinical rejection at three months was 43%, and the patients received a standard rejection treatment (a course of increased MP) and/or increased baseline immunosuppression, depending on the severity of rejection and graft function. The glomerular filtration rate (GFR) at 18 months was significantly better in the patients who were screened and treated for subclinical rejection than in the historical patients (86.7±22.5 vs. 67.9±31.9 ml/min/1.73 m2, respectively). The improvement was most remarkable in the youngest (<2 years) age group (94.1±11.0 vs. 67.9±26.8 ml/min/1.73 m2). Histological findings of chronic allograft nephropathy were also more common in the historical patients in the 18-month protocol biopsy. All pediatric renal TX patients receive MP as a part of the baseline immunosuppression. Although the maintenance dose of MP is very low in the majority of the patients, the well-known steroid-related adverse effects are not uncommon. A previous study in Finnish pediatric TX patients has shown that steroid exposure, measured as the area under the concentration-time curve (AUC), rather than the dose, correlates with the adverse effects. In the present study, MP AUC was measured in sixteen stable maintenance patients, and a correlation with excess weight gain during the 12 months after TX, as well as with height deficit, was found. A novel bioassay measuring the activation of the glucocorticoid receptor-dependent transcription cascade was also employed to assess the biological effect of MP. Glucocorticoid bioactivity was found to be related to the adverse effects, although the relationship was not as apparent as that with the serum MP concentration. The findings in this study support individualized monitoring and adjustment of immunosuppression based on pharmacokinetics, graft function, and histology. Pharmacokinetic profiles are helpful in estimating drug exposure and thus in identifying the patients who might be at risk for excessive or insufficient immunosuppression.
Individualized dosing and monitoring of blood concentrations should definitely be employed with CsA, and possibly also with steroids. As an alternative to complete steroid withdrawal, individualized dosing based on drug exposure monitoring might help in avoiding the adverse effects. Early screening for and treatment of subclinical immunoactivation is beneficial, as it improves the prospects of good long-term graft function.
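As background to the exposure measure used above, the area under the concentration-time curve is commonly estimated from timed blood samples with the linear trapezoidal rule; the sketch below is an illustrative calculation with hypothetical methylprednisolone concentrations, not data or code from the study.

```python
# Illustrative sketch: drug exposure as the area under the concentration-time
# curve (AUC), estimated with the linear trapezoidal rule from timed samples.
# The times and concentrations below are hypothetical, not study data.

def auc_trapezoidal(times_h, concentrations):
    """Linear trapezoidal AUC over the sampled interval (concentration x hours)."""
    auc = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        auc += (concentrations[i] + concentrations[i - 1]) / 2.0 * dt
    return auc

# Hypothetical methylprednisolone profile (hours after dose, ug/l):
times = [0, 1, 2, 4, 6, 8, 12]
conc = [0, 95, 80, 48, 27, 15, 5]
print(round(auc_trapezoidal(times, conc), 1))
```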
Abstract:
Cord blood is a well-established alternative to bone marrow and peripheral blood stem cell transplantation. To this day, over 400 000 unrelated donor cord blood units have been stored in cord blood banks worldwide. To enable successful cord blood transplantation, recent efforts have been focused on finding ways to increase the hematopoietic progenitor cell content of cord blood units. In this study, factors that may improve the selection and quality of cord blood collections for banking were identified. In 167 consecutive cord blood units collected from healthy full-term neonates and processed at a national cord blood bank, mean platelet volume (MPV) correlated with the numbers of cord blood unit hematopoietic progenitors (CD34+ cells and colony-forming units); this is a novel finding. Mean platelet volume can be thought to represent general hematopoietic activity, as newly formed platelets have been reported to be large. Stress during delivery is hypothesized to lead to the mobilization of hematopoietic progenitor cells through cytokine stimulation. Accordingly, low-normal umbilical arterial pH, thought to be associated with perinatal stress, correlated with high cord blood unit CD34+ cell and colony-forming unit numbers. The associations were closer in vaginal deliveries than in Cesarean sections. Vaginal delivery entails specific physiological changes, which may also affect the hematopoietic system. Thus, different factors may predict cord blood hematopoietic progenitor cell numbers in the two modes of delivery. Theoretical models were created to enable the use of platelet characteristics (mean platelet volume) and perinatal factors (umbilical arterial pH and placental weight) in the selection of cord blood collections with high hematopoietic progenitor cell counts. These observations could thus be implemented as a part of the evaluation of cord blood collections for banking. The quality of cord blood units has been the focus of several recent studies. However, hemostasis activation during cord blood collection is scarcely evaluated in cord blood banks. In this study, hemostasis activation was assessed with prothrombin activation fragment 1+2 (F1+2), a direct indicator of thrombin generation, and platelet factor 4 (PF4), indicating platelet activation. Altogether three sample series were collected during the set-up of the cord blood bank as well as after changes in personnel and collection equipment. The activation decreased from the first to the subsequent series, which were collected with the bank fully in operation and following international standards, and was at a level similar to that previously reported for healthy neonates. As hemostasis activation may have unwanted effects on cord blood cell contents, it should be minimized. The assessment of hemostasis activation could be implemented as a part of process control in cord blood banks. Culture assays provide information about the hematopoietic potential of the cord blood unit. In processed cord blood units prior to freezing, megakaryocytic colony growth was evaluated in semisolid cultures with a novel scoring system. Three investigators analyzed the colony assays, and the scores were highly concordant. With such scoring systems, the growth potential of various cord blood cell lineages can be assessed. In addition, erythroid cells were observed in liquid cultures of cryostored and thawed, unseparated cord blood units without exogenous erythropoietin. 
This was hypothesized to be due to the erythropoietic effect of thrombopoietin, endogenous erythropoietin production, and diverse cell-cell interactions in the culture. This observation underscores the complex interactions of cytokines and supporting cells in the heterogeneous cell population of the thawed cord blood unit.
Abstract:
Osteoporosis is not only a disease of the elderly, but is increasingly diagnosed in chronically ill children. Children with severe motor disabilities, such as cerebral palsy (CP), have many risk factors for osteoporosis. Adults with intellectual disability (ID) are also prone to low bone mineral density (BMD) and increased fractures. This study was carried out to identify risk factors for low BMD and osteoporosis in children with severe motor disability and in adults with ID. In this study, 59 children with severe motor disability, ranging in age from 5 to 16 years, were evaluated. Lumbar spine BMD was measured with dual-energy X-ray absorptiometry. BMD values were corrected for bone size by calculating bone mineral apparent density (BMAD), and for bone age. The values were transformed into Z-scores by comparison with normative data. Spinal radiographs were assessed for vertebral morphology. Blood samples were obtained for biochemical parameters. Parents were requested to keep a food diary for three days, and the median daily energy and nutrient intakes were calculated. Fractures were common; 17% of the children had sustained peripheral fractures and 25% had compression fractures. BMD was low in the children; the median spinal BMAD Z-score was -1.0 (range -5.0 to +2.0), and the BMAD Z-score was below -2.0 in 20% of the children. A low BMAD Z-score and hypercalciuria were significant risk factors for fractures. In children with motor disability, calcium intakes were sufficient, while total energy and vitamin D intakes were not. In the vitamin D intervention studies, 44 children and adolescents with severe motor disability and 138 adults with ID were studied. After baseline blood samples, the children were divided into two groups: those in the treatment group received 1000 IU of peroral vitamin D3 five days a week for 10 weeks, and subjects in the control group continued with their normal diet. Adults with ID were allocated to receive either 800 IU of peroral vitamin D3 daily for six months or a single intramuscular injection of 150 000 IU of vitamin D3. Blood samples were obtained at baseline and after treatment. Serum concentrations of 25-OH-vitamin D (S-25-OHD) were low in all subgroups before the vitamin D intervention: in almost 60% of the children and in 77% of the adults, the S-25-OHD concentration was below 50 nmol/L, indicating vitamin D insufficiency. After the vitamin D intervention, 19% of the children and 42% of the adults who received vitamin D perorally, and 12% of the adults who received vitamin D intramuscularly, had optimal S-25-OHD (>80 nmol/L). This study demonstrated that low BMD and peripheral and spinal fractures are common in children with severe motor disabilities. Vitamin D status was suboptimal in the majority of children with motor disability and adults with ID. Vitamin D insufficiency can be corrected with vitamin D supplements; the peroral dose should be at least 800 IU per day.
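For reference, the size correction and Z-score transformation mentioned above are commonly computed as follows; the lumbar-spine BMAD expression shown (Carter's approximation) and the Z-score form are standard choices assumed here for illustration, not necessarily the exact equations used in this thesis.

```latex
% Commonly used size correction and Z-score transformation (assumed forms,
% shown for illustration only):
% BMC = bone mineral content (g), A_p = projected bone area (cm^2).
\[
\mathrm{BMAD} = \frac{\mathrm{BMC}}{A_p^{\,3/2}},
\qquad
Z\text{-score} = \frac{\mathrm{BMAD}_{\text{patient}} - \overline{\mathrm{BMAD}}_{\text{age-matched}}}{\mathrm{SD}_{\text{age-matched}}}
\]
```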