933 results for Abnormal Subgroups
Abstract:
An anterior cruciate ligament (ACL) tear is a common sports injury of the knee. Arthroscopic reconstruction using autogenous graft material is widely used for patients with ACL instability. The grafts most commonly used are the patellar and hamstring tendons, fixed by various techniques. Although clinical evaluation and conventional radiography are routinely used in follow-up after ACL surgery, magnetic resonance imaging (MRI) plays an important role in diagnosing postoperative complications. The aim of this thesis was to study the clinical outcome of patellar and hamstring tendon ACL reconstruction techniques. In addition, the postoperative appearance of the ACL graft was evaluated using several MRI sequences. Of the 175 patients who underwent an arthroscopically assisted ACL reconstruction, 99 were randomized into patellar tendon (n=51) or hamstring tendon (n=48) groups. In addition, 62 patients with hamstring graft ACL reconstruction were randomized into either cross-pin (n=31) or interference screw (n=31) fixation groups. Follow-up evaluation determined knee laxity, isokinetic muscle performance, and several knee scores. Lateral and anteroposterior view radiographs were obtained. Several MRI sequences were acquired with a 1.5-T imager. The appearance and enhancement pattern of the graft and periligamentous tissue, and the location of the bone tunnels, were evaluated. After MRI, arthroscopy was performed on 14 symptomatic knees. The results revealed no significant differences in the 2-year outcome between the groups. In the hamstring tendon group, the average femoral and tibial bone tunnel diameters increased during the 2-year follow-up by 33% and 23%, respectively. In the asymptomatic knees, the graft showed homogeneous, low signal intensity with periligamentous streaks of intermediate signal intensity on T2-weighted MR images. In the symptomatic knees, arthroscopy revealed 12 abnormal grafts and two meniscal tears, each with an intact graft.
In the 3 lax grafts seen on arthroscopy, MRI showed an intact graft but improper bone tunnel placement. For diagnosing graft failure, all MRI findings combined gave a specificity of 90% and a sensitivity of 81%. In conclusion, all techniques appeared to improve patients' performance and were therefore considered good choices for ACL reconstruction. In follow-up, MRI permits direct evaluation of the ACL graft, the bone tunnels, and additional disorders of the knee. Bone tunnel enlargement and contrast enhancement of the periligamentous tissue were non-specific MRI findings that did not signify ACL deficiency. With an intact graft and optimal femoral bone tunnel placement, graft deficiency is unlikely, and the MRI examination should be carefully scrutinized for other possible causes of the patient's symptoms.
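As a reminder of how the diagnostic figures above are derived, the following sketch computes sensitivity and specificity from a 2x2 table. The thesis reports only the resulting percentages (sensitivity 81%, specificity 90% for all MRI findings combined), so the counts below are hypothetical values chosen to reproduce those percentages, not the actual study data.

```python
# Sensitivity and specificity from a 2x2 diagnostic table.
# Counts are hypothetical; only the percentages come from the abstract.

def sensitivity(tp, fn):
    """True-positive rate: failed grafts correctly flagged by MRI."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: intact grafts correctly cleared by MRI."""
    return tn / (tn + fp)

# Hypothetical counts chosen to reproduce the reported percentages.
print(round(sensitivity(17, 4) * 100))  # -> 81
print(round(specificity(9, 1) * 100))   # -> 90
```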
Abstract:
Atrial fibrillation (AF) is the most common tachyarrhythmia and is associated with substantial morbidity, increased mortality, and cost. Treatment modalities for AF have multiplied, but results are still far from optimal, and more individualized therapy may be beneficial; achieving it calls for improved diagnostics. The aim of this study was to find non-invasive parameters, obtained during sinus rhythm, that reflect electrophysiological patterns related to a propensity to AF, particularly to AF occurring without any associated heart disease (lone AF). Overall, 240 subjects were enrolled: 136 patients with paroxysmal lone AF and 104 controls (mean age 45 years, 75% males). Signal measurements were performed by non-invasive magnetocardiography (MCG) and by invasive electroanatomic mapping (EAM). High-pass filtering techniques and a new method based on a surface gradient technique were adapted to analyze the atrial MCG signal. EAM was used to elucidate atrial activation in patients and as a reference for MCG. The results showed that MCG mapping is an accurate method for detecting atrial electrophysiologic properties. In lone paroxysmal AF, the duration of the atrial depolarization complex was marginally prolonged. The difference was more obvious in women and was also related to interatrial conduction patterns. In the focal type of AF (75%), the root mean square (RMS) amplitudes of the atrial signal were normal, but in AF without demonstrable triggers the late atrial RMS amplitudes were reduced. In addition, the atrial characteristics tended to remain similar even when examined several years after the first AF episodes. The intra-atrial recordings confirmed three distinct sites of electrical connection from the right to the left atrium (LA): the Bachmann bundle (BB), the margin of the fossa ovalis (FO), and the coronary sinus ostial area (CS). The propagation of the atrial signal could also be evaluated non-invasively.
Three MCG atrial wave types were identified, each representing a distinct interatrial activation pattern. In conclusion, in paroxysmal lone AF, active focal triggers are common, atrial depolarization is slightly prolonged but of normal amplitude, and the arrhythmia does not necessarily lead to electrical or mechanical dysfunction of the atria. In women the prolongation of atrial depolarization is more obvious, which may be related to gender differences in the presentation of AF. A significant minority of patients with lone AF lack frequent focal triggers, and in them the late atrial signal amplitude is reduced, possibly signifying a wider degenerative process in the LA. In lone AF, natural impulse propagation to the LA during sinus rhythm goes through one or more of the principal pathways described. The BB is the most common route, but in one-third of patients the earliest LA activation occurs outside the BB. Susceptibility to paroxysmal lone AF is associated with propagation of the atrial signal via the margin of the FO or via multiple pathways; when conduction occurs via the BB, it is associated with prolonged atrial activation. Thus, altered and alternative conduction pathways may contribute to the pathogenesis of lone AF. There is growing evidence of variability in the genesis of AF, even within lone paroxysmal AF, and the present study suggests that this variation may be reflected in the cardiac signal pattern. Recognizing the distinct signal profiles may assist in understanding the pathogenesis of AF and in identifying subgroups for patient-tailored therapy.
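The RMS amplitude measure used above for the atrial MCG signal has a standard definition: the square root of the mean of the squared samples. A minimal sketch, with a hypothetical signal segment (the actual MCG data and units are not given in the abstract):

```python
import math

def rms(samples):
    """Root mean square amplitude of a sampled signal segment."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

# Hypothetical late-atrial signal segment in arbitrary units.
segment = [3.0, -4.0, 3.0, -4.0]
print(rms(segment))  # mean of squares is 12.5, so RMS = sqrt(12.5)
```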
Abstract:
Background: Irritable bowel syndrome (IBS) is a common functional gastrointestinal (GI) disorder characterised by abdominal pain and abnormal bowel function. It is associated with a high rate of healthcare consumption and significant health care costs. The prevalence and economic burden of IBS in Finland had not been studied before. The aims of this study were to assess the prevalence of IBS according to various diagnostic criteria and to study the rates of psychiatric and somatic comorbidity in IBS. In addition, health care consumption and the societal costs of IBS were evaluated. Methods: The study was a two-phase postal survey. Questionnaire I, identifying IBS by the Manning 2 (at least two of the six Manning symptoms), Manning 3 (at least three Manning symptoms), Rome I, and Rome II criteria, was mailed to a random sample of 5,000 working-age subjects. It also covered extra-GI symptoms such as headache, back pain, and depression. Questionnaire II, covering rates of physician visits and use of GI medication, was sent to subjects fulfilling the Manning 2 or Rome II IBS criteria in Questionnaire I. Results: The response rates were 73% and 86% for Questionnaires I and II, respectively. The prevalence of IBS was 15.9%, 9.6%, 5.6%, and 5.1% according to the Manning 2, Manning 3, Rome I, and Rome II criteria, respectively. Of those meeting the Rome II criteria, 97% also met the Manning 2 criteria. Severe abdominal pain was reported more often by subjects meeting either of the Rome criteria than by those meeting either of the Manning criteria. Depression, anxiety, and several somatic symptoms were more common among subjects meeting any IBS criterion than among controls. Of subjects with depressive symptoms, 11.6% met the Rome II IBS criteria, compared with 3.7% of those without depressiveness. Subjects meeting any IBS criteria made more physician visits than controls. Intensity of GI symptoms and presence of dyspeptic symptoms were the strongest predictors of GI consultations.
Presence of dyspeptic symptoms and a history of abdominal pain in childhood also predicted non-GI visits. Annual GI-related individual costs were higher in the Rome II group (€497) than in the Manning 2 group (€295). Direct expenses of GI symptoms and non-GI physician visits ranged between €98 million for the Rome II and €230 million for the Manning 2 criteria. Conclusions: The prevalence of IBS varies substantially depending on the criteria applied. The Rome II criteria are more restrictive than Manning 2, and they identify an IBS population with more severe GI symptoms, more frequent health care use, and higher individual health care costs. Subjects with IBS demonstrate high rates of psychiatric and somatic comorbidity regardless of health care seeking status. Perceived symptom severity, rather than psychiatric comorbidity, predicts health care seeking for GI symptoms. IBS incurs considerable medical costs. The direct GI and non-GI costs are equivalent to up to 5% of outpatient health care and medicine costs in Finland. A more integral approach to IBS by physicians, accounting also for comorbid conditions, may produce a more favourable course in IBS patients and reduce health care expenditures.
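The symptom-count logic behind criteria such as Manning 2 (at least two of the six Manning symptoms) and Manning 3 (at least three) can be sketched as below. The symptom names and the dictionary encoding of questionnaire responses are illustrative assumptions, not taken from the thesis questionnaire.

```python
# Illustrative sketch of symptom-count diagnostic criteria.
# Symptom names and response encoding are hypothetical.

MANNING_SYMPTOMS = [
    "pain_relieved_by_defecation",
    "looser_stools_at_pain_onset",
    "more_frequent_stools_at_pain_onset",
    "abdominal_distension",
    "mucus_per_rectum",
    "feeling_of_incomplete_evacuation",
]

def meets_manning(responses, threshold):
    """Count positive Manning symptoms and compare against the threshold."""
    count = sum(1 for s in MANNING_SYMPTOMS if responses.get(s, False))
    return count >= threshold

subject = {"pain_relieved_by_defecation": True, "abdominal_distension": True}
print(meets_manning(subject, 2))  # Manning 2 -> True
print(meets_manning(subject, 3))  # Manning 3 -> False
```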
Abstract:
The project consisted of two long-term follow-up studies of preterm children, addressing the question of whether intrauterine growth restriction affects the outcome. Assessment at 5 years of age of 203 children with a birth weight below 1000 g, born in Finland in 1996-1997, showed that 9% of the children had cognitive impairment, 14% had cerebral palsy, and 4% needed a hearing aid. The intelligence quotient was lower (p<0.05) than the reference value. Thus, 20% exhibited major disabilities, 19% minor disabilities, and 61% had no functional abnormalities. Being small for gestational age (SGA) was associated with sub-optimal growth later. Among children born before 27 gestational weeks, the SGA children had more neuropsychological disabilities than those born appropriate for gestational age (AGA). In another cohort, with birth weights below 1500 g and assessed at 5 years of age, echocardiography showed a thickened interventricular septum and a decreased left ventricular end-diastolic diameter in both SGA- and AGA-born children. They also had higher systolic blood pressure than the reference. Laser-Doppler flowmetry showed different endothelium-dependent and -independent vasodilation responses in the AGA children compared with those of the controls. SGA was not associated with cardiovascular abnormalities. Auditory event-related potentials (AERPs) were recorded using an oddball paradigm with frequency deviants (standard tone 500 Hz; deviant tone 750 Hz with 10% probability). At term, the P350 was smaller in SGA and AGA infants than in controls. At 12 months, the automatic change-detection peak (mismatch negativity, MMN) was observed in the controls, whereas the preterm infants had a difference positivity that correlated with their neurodevelopment scores. At 5 years of age, the P1 deflection, which reflects primary auditory processing, was smaller, and the MMN larger, in the preterm than in the control children.
Even with a challenging paradigm or a distraction paradigm, P1 was smaller in the preterm than in the control children. The SGA and AGA children showed similar AERP responses. Prematurity is a major risk factor for abnormal brain development. Preterm children showed signs of cardiovascular abnormality, suggesting that prematurity per se may carry a risk for later morbidity. The small positive amplitudes in the AERPs suggest persistently altered auditory processing in the preterm infants.
Abstract:
Pediatric renal transplantation (TX) has evolved greatly during the past few decades, and today TX is considered the standard care for children with end-stage renal disease. In Finland, 191 children had received renal transplants by October 2007, and 42% of them have already reached adulthood. Improvements in the treatment of end-stage renal disease, surgical techniques, intensive care medicine, and immunosuppressive therapy have paved the way to the current highly successful outcomes of pediatric transplantation. In children, the transplanted graft should last for decades, and normal growth and development should be guaranteed. These objectives set considerable requirements for optimizing and fine-tuning the post-operative therapy. Careful optimization of immunosuppressive therapy is crucial in protecting the graft against rejection, but also in protecting the patient against adverse effects of the medication. In the present study, the results of a retrospective investigation into individualized dosing of immunosuppressive medication, based on pharmacokinetic profiles, therapeutic drug monitoring, graft function and histology studies, and glucocorticoid biological activity determinations, are reported. Subgroups of a total of 178 patients who received renal transplants in 1988-2006 were included in the study. The mean age at TX was 6.5 years, and approximately 26% of the patients were <2 years of age. The most common diagnosis leading to renal TX was congenital nephrosis of the Finnish type (NPHS1). Pediatric patients in Finland receive standard triple immunosuppression consisting of cyclosporine A (CsA), methylprednisolone (MP), and azathioprine (AZA) after renal TX. Optimal dosing of these agents is important to prevent rejections and preserve graft function on the one hand, and to avoid the potentially serious adverse effects on the other. CsA has a narrow therapeutic window and individually variable pharmacokinetics.
Therapeutic monitoring of CsA is, therefore, mandatory. Traditionally, CsA monitoring has been based on pre-dose trough levels (C0), but recent pharmacokinetic and clinical studies have revealed that the immunosuppressive effect may be related to diurnal CsA exposure and to the blood CsA concentration 0-4 hours after dosing. The two-hour post-dose concentration (C2) has proved a reliable surrogate marker of CsA exposure. Individual starting doses of CsA were analyzed in 65 patients. A recommended dose based on a pre-TX pharmacokinetic study was calculated for each patient according to the pre-TX protocol. The predicted dose was clearly higher in the youngest children than in the older ones (22.9±10.4 and 10.5±5.1 mg/kg/d in patients <2 and >8 years of age, respectively). The actually administered oral doses of CsA were collected for three weeks after TX and compared with the pharmacokinetically predicted dose. After the TX, dosing of CsA was adjusted according to clinical parameters and the blood CsA trough concentration. The pharmacokinetically predicted dose and patient age were the two significant parameters explaining post-TX doses of CsA. Accordingly, young children received significantly higher oral doses of CsA than the older ones. The correlation with the actually administered doses after TX was best in those patients who had a predicted dose clearly higher or lower (>±25%) than the average in their age group. Due to the great individual variation in pharmacokinetics, standardized dosing of CsA (based on body mass or surface area) may not be adequate. Pre-TX profiles are helpful in determining suitable initial CsA doses. CsA monitoring based on trough and C2 concentrations was analyzed in 47 patients who received renal transplants in 2001-2006. C0, C2, and experienced acute rejections were recorded during the post-TX hospitalization, and also three months after TX when the first protocol core biopsy was obtained.
The patients who remained rejection-free had slightly higher C2 concentrations, especially very early after TX. After the first two weeks, however, the trough level was also higher in the rejection-free patients than in those with acute rejections. Three months after TX, the trough level was higher in patients with normal histology than in those with rejection changes in the routine biopsy. Monitoring of both the trough level and C2 may thus be warranted to guarantee a sufficient peak concentration and baseline immunosuppression on the one hand and to avoid over-exposure on the other. Controlling rejection in the early months after transplantation is crucial, as it may contribute to the development of long-term allograft nephropathy. Recently, it has become evident that immunoactivation fulfilling the histological criteria of acute rejection is possible in a well-functioning graft with no clinical signs or laboratory perturbations. The influence of treating subclinical rejection, diagnosed in the 3-month protocol biopsy, on graft function and histology 18 months after TX was analyzed in 22 patients and compared with 35 historical control patients. The incidence of subclinical rejection at three months was 43%, and the patients received a standard rejection treatment (a course of increased MP) and/or increased baseline immunosuppression, depending on the severity of rejection and graft function. The glomerular filtration rate (GFR) at 18 months was significantly better in the patients who were screened and treated for subclinical rejection than in the historical patients (86.7±22.5 vs. 67.9±31.9 ml/min/1.73 m2, respectively). The improvement was most remarkable in the youngest (<2 years) age group (94.1±11.0 vs. 67.9±26.8 ml/min/1.73 m2). Histological findings of chronic allograft nephropathy were also more common in the historical patients in the 18-month protocol biopsy. All pediatric renal TX patients receive MP as part of the baseline immunosuppression.
Although the maintenance dose of MP is very low in the majority of the patients, the well-known steroid-related adverse effects are not uncommon. A previous study in Finnish pediatric TX patients showed that steroid exposure, measured as the area under the concentration-time curve (AUC), rather than the dose, correlates with the adverse effects. In the present study, MP AUC was measured in sixteen stable maintenance patients, and a correlation was found with excess weight gain during the 12 months after TX, as well as with height deficit. A novel bioassay measuring the activation of the glucocorticoid receptor-dependent transcription cascade was also employed to assess the biological effect of MP. Glucocorticoid bioactivity was found to be related to the adverse effects, although the relationship was not as apparent as that with the serum MP concentration. The findings in this study support individualized monitoring and adjustment of immunosuppression based on pharmacokinetics, graft function, and histology. Pharmacokinetic profiles are helpful in estimating drug exposure and thus in identifying the patients who might be at risk of excessive or insufficient immunosuppression. Individualized doses and monitoring of blood concentrations should definitely be employed with CsA, but possibly also with steroids. As an alternative to complete steroid withdrawal, individualized dosing based on drug-exposure monitoring might help in avoiding the adverse effects. Early screening and treatment of subclinical immunoactivation is beneficial, as it improves the prospects of good long-term graft function.
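The exposure measure used above, the area under the concentration-time curve (AUC), is commonly computed from sparse blood samples with the trapezoidal rule. A minimal sketch with a hypothetical concentration profile (the times and concentrations below are illustrative, not measured data from the study):

```python
def auc_trapezoid(times, concs):
    """Area under the concentration-time curve by the trapezoidal rule."""
    return sum(
        (concs[i] + concs[i + 1]) / 2 * (times[i + 1] - times[i])
        for i in range(len(times) - 1)
    )

# Hypothetical drug concentration profile: sampling times (h) and levels.
times = [0, 1, 2, 4, 6]
concs = [0, 10, 8, 4, 2]
print(auc_trapezoid(times, concs))  # -> 32.0
```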
Abstract:
The autonomic nervous system is an important modulator of ventricular repolarization and arrhythmia vulnerability. This study explored the effects of cardiovascular autonomic function tests on repolarization and its heterogeneity, with special reference to congenital arrhythmogenic disorders typically associated with stress-induced fatal ventricular arrhythmias. The first part explored the effects of standardized autonomic tests on QT intervals in the 12-lead electrocardiogram and in multichannel magnetocardiography in 10 healthy adults. The second part studied the effects of deep breathing, the Valsalva manoeuvre, mental stress, sustained handgrip, and mild exercise on QT intervals in asymptomatic patients with the LQT1 subtype of the hereditary long QT syndrome (n=9) and in patients with arrhythmogenic right ventricular dysplasia (ARVD, n=9). Even strong sympathetic activation had no effect on spatial QT interval dispersion in healthy subjects, but deep respiratory efforts and the Valsalva manoeuvre influenced it in opposite ways in electrocardiographic and magnetocardiographic recordings. LQT1 patients showed blunted QT interval and sinus nodal responses to sympathetic challenge, as well as exaggerated QT prolongation during the recovery phases. LQT1 patients showed a QT interval recovery overshoot in 2.4 ± 1.7 tests, compared with 0.8 ± 0.7 in healthy controls (P = 0.02). Valsalva strain prolonged the T-wave peak to T-wave end interval only in the LQT1 patients, which is considered to reflect the arrhythmogenic substrate in this syndrome. ARVD patients showed signs of abnormal repolarization in the right ventricle, modulated by abrupt sympathetic activation. An electrocardiographic marker reflecting interventricular dispersion of repolarization was introduced. It showed that LQT1 patients exhibit a repolarization gradient from the left ventricle towards the right ventricle that is significantly larger than in controls.
In contrast, ARVD patients showed a repolarization gradient from the right ventricle towards the left. Valsalva strain amplified the repolarization gradient in LQT1 patients, whereas it transiently reversed it in patients with ARVD. In conclusion, intrathoracic volume and pressure changes influence regional electrocardiographic and magnetocardiographic QT interval measurements differently. In particular, the recovery phases of standard cardiovascular autonomic function tests and the Valsalva manoeuvre reveal the abnormal repolarization in asymptomatic LQT1 patients. Both LQT1 and ARVD patients have abnormal interventricular repolarization gradients, modulated by abrupt sympathetic activation. Autonomic testing, and in particular the Valsalva manoeuvre, is potentially useful in unmasking abnormal repolarization in these syndromes.
Abstract:
Microstructure and microtexture evolution during static annealing of a hot-extruded AZ21 magnesium alloy was studied. Apart from fine recrystallized equiaxed grains and large elongated deformed grains, a third, abnormal kind of grain, stacked one after another in rows parallel to the extrusion direction, was observed. The crystallographic misorientation inside these grains was similar to that of the fine recrystallized grains, whereas the large elongated grains exhibited significant in-grain misorientation. A self-consistent mechanistic model was developed to describe the formation of these grain morphologies during dynamic recrystallization (DRX). The texture of the pre-extruded material, although lost in DRX, leaves a unique signature which manifests itself in the form of these grain morphologies. The origin of the abnormal stacked grains was associated with slow nucleation in pre-extruded grains of a certain orientation. Further annealing resulted in large secondary recrystallized grains with occasional extension twins. (c) 2009 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Abstract:
Unexpected swelling induced in foundation soils can cause distress to the structures founded on them. In this paper, the swelling of kaolinitic soils due to interaction with alkali solutions is reported. The induced swelling is attributed to the formation of new minerals, which has been confirmed by X-ray diffraction patterns and SEM studies. To understand the effect of alkali concentration and duration of interaction, two series of consolidation experiments were carried out. In series 1, the specimens were remoulded with water and inundated with alkali solutions; in series 2, the specimens were both remoulded and inundated with the same alkali solutions. A steep compression during the loading cycle and no abnormal swelling during the unloading cycle were observed for the specimens remoulded with water and inundated with 1 N NaOH solution. The steep compression is due to the segregation or breakdown of clay minerals caused by alkali interaction. For the specimens inundated with 4 N NaOH solution, abnormal swelling was observed during the unloading cycle of the consolidation test. New minerals are formed on interaction of the soil with the 4 N solution, as confirmed by X-ray diffraction patterns. These minerals are known to have very fine pores and a high water-holding capacity. The differences in the amount of swelling between samples remoulded with water and those remoulded with alkali solution are due to variations in the concentration of alkali and the duration of interaction.
Abstract:
Osteoporosis is not only a disease of the elderly but is increasingly diagnosed in chronically ill children. Children with severe motor disabilities, such as cerebral palsy (CP), have many risk factors for osteoporosis. Adults with intellectual disability (ID) are also prone to low bone mineral density (BMD) and increased fractures. This study was carried out to identify risk factors for low BMD and osteoporosis in children with severe motor disability and in adults with ID. In this study, 59 children with severe motor disability, ranging in age from 5 to 16 years, were evaluated. Lumbar spine BMD was measured with dual-energy X-ray absorptiometry. BMD values were corrected for bone size by calculating bone mineral apparent density (BMAD), and for bone age. The values were transformed into Z-scores by comparison with normative data. Spinal radiographs were assessed for vertebral morphology. Blood samples were obtained for biochemical parameters. Parents were requested to keep a food diary for three days, from which the median daily energy and nutrient intakes were calculated. Fractures were common: 17% of the children had sustained peripheral fractures and 25% had compression fractures. BMD was low in the children; the median spinal BMAD Z-score was -1.0 (range -5.0 to +2.0), and the BMAD Z-score was below -2.0 in 20% of the children. A low BMAD Z-score and hypercalciuria were significant risk factors for fractures. In the children with motor disability, calcium intakes were sufficient, while total energy and vitamin D intakes were not. In the vitamin D intervention studies, 44 children and adolescents with severe motor disability and 138 adults with ID were studied. After baseline blood sampling, the children were divided into two groups: those in the treatment group received 1000 IU of peroral vitamin D3 five days a week for 10 weeks, while subjects in the control group continued with their normal diet.
Adults with ID were allocated to receive either 800 IU of peroral vitamin D3 daily for six months or a single intramuscular injection of 150 000 IU of D3. Blood samples were obtained at baseline and after treatment. Serum concentrations of 25-OH-vitamin D (S-25-OHD) were low in all subgroups before the vitamin D intervention: in almost 60% of the children and 77% of the adults, the S-25-OHD concentration was below 50 nmol/L, indicating vitamin D insufficiency. After the intervention, 19% of the children and 42% of the adults who received vitamin D perorally, and 12% of the adults who received vitamin D intramuscularly, had optimal S-25-OHD (>80 nmol/L). This study demonstrated that low BMD and peripheral and spinal fractures are common in children with severe motor disabilities. Vitamin D status was suboptimal in the majority of children with motor disability and adults with ID. Vitamin D insufficiency can be corrected with vitamin D supplements; the peroral dose should be at least 800 IU per day.
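The S-25-OHD thresholds used above (below 50 nmol/L indicating insufficiency, above 80 nmol/L considered optimal) amount to a simple classification rule. A sketch of that rule follows; the label for the intermediate band is my own wording, as the abstract does not name it.

```python
# Classification by the S-25-OHD thresholds quoted in the abstract.
# The "intermediate" label for the 50-80 nmol/L band is an assumption.

def vitamin_d_status(s25ohd_nmol_l):
    """Classify a serum 25-OH-vitamin D concentration (nmol/L)."""
    if s25ohd_nmol_l < 50:
        return "insufficient"
    if s25ohd_nmol_l > 80:
        return "optimal"
    return "intermediate"

print(vitamin_d_status(42))  # -> insufficient
print(vitamin_d_status(95))  # -> optimal
```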
Abstract:
Background: Both maternal and fetal complications are increased in diabetic pregnancies. Although hypertensive complications are increased in pregnant women with pregestational diabetes, reports on hypertensive complications in women with gestational diabetes mellitus (GDM) have been contradictory. Congenital malformations and macrosomia are the main fetal complications in Type 1 diabetic pregnancies, whereas fetal macrosomia and birth trauma, but not congenital malformations, are increased in GDM pregnancies. Aims: To study the frequency of hypertensive disorders in gestational diabetes mellitus. To evaluate the risk of macrosomia and brachial plexus injury (Erb’s palsy), and the ability of the 2-hour oral glucose tolerance test (OGTT) combined with the 24-hour glucose profile to distinguish between low and high risks of fetal macrosomia among women with GDM. To evaluate the relationship between glycemic control and the risk of fetal malformations in pregnancies complicated by Type 1 diabetes mellitus. To assess the effect of glycemic control on the occurrence of preeclampsia and pregnancy-induced hypertension (PIH) in Type 1 diabetic pregnancies. Subjects: A total of 986 women with GDM and 203 women with borderline glucose intolerance (one abnormal value in the OGTT) with a singleton pregnancy, 488 pregnant women with Type 1 diabetes (691 pregnancies and 709 offspring), and 1154 pregnant non-diabetic women (1181 pregnancies and 1187 offspring) were investigated. Results: In a prospective study of 81 GDM patients, the combined frequency of preeclampsia and PIH was higher than in 327 non-diabetic controls (19.8% vs 6.1%, p<0.001). On the other hand, in the 203 women with only one abnormal value in the OGTT, the rate of hypertensive complications did not differ from that of the controls. Both the GDM women and those with only one abnormal value in the OGTT had higher pre-pregnancy weights and BMIs than the controls.
In a retrospective study involving 385 insulin-treated and 520 diet-treated GDM patients and 805 non-diabetic control pregnant women, fetal macrosomia occurred more often in the insulin-treated GDM pregnancies (18.2%, p<0.001) than in the diet-treated GDM pregnancies (4.4%) or the control pregnancies (2.2%). The rate of Erb’s palsy in vaginally delivered infants was 2.7% in the insulin-treated women and 2.4% in the diet-treated women, compared with 0.3% in the controls (p<0.001). The cesarean section rate was more than twice as high in the insulin-treated GDM patients as in the controls (42.3% vs 18.6%). A major fetal malformation was observed in 30 (4.2%) of the 709 newborn infants of Type 1 diabetic pregnancies and in 10 (1.4%) of the 735 controls (RR 3.1, 95% CI 1.6–6.2). Even women whose HbA1c levels (normal values below 5.6%) were only slightly increased in early pregnancy (between 5.6 and 6.8%) had a relative risk of fetal malformation of 3.0 (95% CI 1.2–7.5). Only diabetic patients with a normal HbA1c level (<5.6%) in early pregnancy had the same low risk of fetal malformations as the controls. Preeclampsia was diagnosed in 12.8% and PIH in 11.4% of the 616 Type 1 diabetic women without diabetic nephropathy. The corresponding frequencies among the 854 control women were 2.7% (OR 5.2, 95% CI 3.3–8.4) for preeclampsia and 5.6% (OR 2.2, 95% CI 1.5–3.1) for PIH. Multiple logistic regression analysis indicated that glycemic control, nulliparity, diabetic retinopathy, and duration of diabetes were statistically significant independent predictors of preeclampsia. The adjusted odds ratios for preeclampsia were 1.6 (95% CI 1.3–2.0) for each 1%-unit increment in the HbA1c value during the first trimester and 0.6 (95% CI 0.5–0.8) for each 1%-unit decrement during the first half of pregnancy. In contrast, changes in glycemic control during the second half of pregnancy did not alter the risk of preeclampsia.
Conclusions: In Type 1 diabetic pregnancies, it is extremely important to achieve optimal glycemic control before pregnancy and to maintain it throughout pregnancy in order to decrease the complication rates in both the mother and her offspring. The rate of fetal macrosomia and birth trauma in GDM pregnancies, especially among insulin-treated women, is still relatively high. New strategies for screening, diagnosing, and treating GDM must be developed in order to decrease fetal and neonatal complications.
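The relative risk for major malformations quoted above (30 of 709 infants of Type 1 diabetic pregnancies vs 10 of 735 controls, RR 3.1) can be reproduced from the published counts. The sketch below uses a standard log-scale Wald construction for the 95% CI, which is an assumption; the thesis does not state its CI method, so small differences from the published interval (1.6–6.2) are expected.

```python
import math

def relative_risk_ci(a, n1, b, n2, z=1.96):
    """Relative risk with a Wald 95% CI on the log scale.

    a/n1: events/total in the exposed group; b/n2: in the controls.
    """
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Malformation counts from the abstract.
rr, lo, hi = relative_risk_ci(30, 709, 10, 735)
print(round(rr, 1))  # -> 3.1
```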
Abstract:
Gastric motility disorders, including delayed gastric emptying (gastroparesis), impaired postprandial fundic relaxation, and gastric myoelectrical disorders, can occur in type 1 diabetes, chronic renal failure, and functional dyspepsia (FD). Symptoms such as upper abdominal pain, early satiation, bloating, nausea and vomiting may be related to gastroparesis. Diabetic gastroparesis is related to autonomic neuropathy. Scintigraphy is the gold standard for measuring gastric emptying and also provides information about the intragastric distribution of the test meal, but it is expensive, requires specific equipment, and exposes patients to radiation. The 13C-octanoic acid breath test (OBT) is an alternative, indirect method of measuring gastric emptying with a stable isotope. Electrogastrography (EGG) registers the slow wave originating in the pacemaker area of the stomach and regulating the peristaltic contractions of the antrum. This study compared these three methods of measuring gastric motility in patients with type 1 diabetes, functional dyspepsia, and chronic renal failure. Currently no effective drugs for treating gastric motility disorders are available. We studied the effect of nizatidine on gastric emptying, because in preliminary studies this drug has shown a prokinetic effect attributable to its cholinergic properties. Of the type 1 diabetes patients, 26% had delayed gastric emptying of solids as measured by scintigraphy. Abnormal intragastric distribution of the test meal occurred in 37% of the patients, indicating impaired fundic relaxation. The autonomic neuropathy score correlated positively with the gastric emptying rate of solids (P = 0.006), but HbA1c, plasma glucose levels, and abdominal symptoms were unrelated to gastric emptying and the intragastric distribution of the test meal. Gastric emptying of both solids and liquids was normal in all FD patients, but abnormal intragastric distribution occurred in 38% of the patients.
Nizatidine improved symptom scores and quality of life in FD patients, but not significantly. Instead of enhancing gastric emptying, nizatidine slowed it in FD patients (P < 0.05). No significant difference appeared in the frequency of the gastric slow waves measured by EGG between the patients and controls. The correlation between gastric half-emptying times of solids measured by scintigraphy and OBT was poor in both type 1 diabetes and FD patients. According to this study, dynamic dual-tracer scintigraphy is more accurate than OBT or EGG in measuring gastric emptying of solids. Additionally, it provides information about gastric emptying of liquids and the intragastric distribution of the ingested test meal.
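The gastric half-emptying times compared above are derived by fitting a model to the measured retention curve. As an illustrative sketch, assuming a simple mono-exponential retention model R(t) = exp(−kt) (actual scintigraphic and OBT analyses typically use more elaborate power-exponential fits, and the data points below are hypothetical):

```python
import math

def half_emptying_time(times, retention):
    """Estimate gastric half-emptying time t1/2 (same time units as
    `times`) from retention fractions, assuming R(t) = exp(-k*t).

    k is fitted by least squares on ln R(t) with the line forced
    through the origin, since R(0) = 1 by definition.
    """
    num = sum(t * math.log(r) for t, r in zip(times, retention))
    den = sum(t * t for t in times)
    k = -num / den                 # decay constant, 1/time
    return math.log(2) / k        # t1/2 = ln 2 / k

# Hypothetical retention fractions measured at 30, 60, 90, 120 min
t_half = half_emptying_time([30, 60, 90, 120], [0.79, 0.63, 0.50, 0.40])
print(f"t1/2 = {t_half:.0f} min")  # ~90 min for these data
```

The poor agreement between scintigraphic and OBT half-emptying times reported above is then a disagreement between two such fitted parameters, not between raw measurements.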
Abstract:
Infectious diseases put an enormous burden on both children and the elderly in the form of respiratory, gastrointestinal and oral infections. There is evidence suggesting that specific probiotics may be antagonistic to pathogens and may enhance the immune system, but the clinical evidence is still too sparse to permit general conclusions on the disease-preventive effects of probiotics. This thesis, consisting of four independent, double-blind, placebo-controlled clinical trials, investigated whether Lactobacillus GG (LGG) or a specific probiotic combination containing LGG would reduce the risk of common infections or the prevalence of pathogens in healthy and infection-prone children and in independent and institutionalised elderly people. In healthy day-care children, the 7-month consumption of probiotic milk containing Lactobacillus GG appeared to postpone the first acute respiratory infection (ARI) by one week (p=0.03, adjusted p=0.16), and to reduce complicated infections (39% vs. 47%, p<0.05, adjusted p=0.13), as well as the need for antibiotic treatment (44% vs. 54%, p=0.03, adjusted p=0.08) and day-care absences (4.9 vs. 5.8 days, p=0.03, adjusted p=0.09) compared to the placebo milk. In infection-prone children, the 6-month consumption of a combination of four probiotic bacteria (LGG, L. rhamnosus LC705, Propionibacterium freudenreichii JS, Bifidobacterium breve 99) taken in capsules appeared to reduce recurrent ARIs (72% vs. 82%, p<0.05; adjusted p=0.06), and the effect was particularly noticeable in a subgroup of children with allergic diseases (12% vs. 33%, p=0.03), although no effect on the presence of nasopharyngeal rhinovirus or enterovirus was seen. The 5-month consumption of the same probiotic combination did not show any beneficial effects on respiratory infections in frail, institutionalised elderly subjects.
In healthy children receiving Lactobacillus GG, the reduction in complications resulted in a marginal reduction in the occurrence of acute otitis media (AOM) (31% vs. 39%, p=0.08; adjusted p=0.19), and the postponement of the first AOM episode by 12 days (p=0.04; adjusted p=0.09). However, in otitis-prone children, a probiotic combination did not reduce the occurrence of AOM or the total prevalence of common AOM pathogens (Streptococcus pneumoniae, Haemophilus influenzae, Moraxella catarrhalis), except in children with allergic diseases, in whom probiotics reduced recurrent AOM episodes (0% vs. 14%, p=0.03). In addition, an interaction between probiotics and bacterial carriage was seen: probiotics reduced AOM in children who did not carry any bacterial pathogens (63% vs. 83%), but the effect was the reverse in children carrying bacteria in the nasopharynx (74% vs. 62%) (p<0.05). Long-term probiotic treatment, either LGG given in milk to healthy children for 7 months or a combination of probiotics given in capsules to institutionalised elderly subjects for 5 months, did not reduce the occurrence of acute diarrhoea. However, when the probiotic combination (LGG, L. rhamnosus LC705, Propionibacterium JS) was given in cheese to independent elderly subjects for 4 months, the oral carriage of high Candida counts was reduced in the probiotic group vs. the placebo group (21% vs. 34%, p=0.01, adjusted p=0.004). The risk of hyposalivation was also reduced in the probiotic group (p=0.05). In conclusion, probiotics appear to slightly alleviate the severity of infections by postponing their appearance and by reducing complications and the need for antimicrobial treatments. In addition, they appear to prevent recurrent infections in certain subgroups of children, such as infection-prone children with allergic diseases.
Alleviating ARI by probiotics may lead to a marginal reduction in the occurrence of AOM in healthy children, but not in infection-prone children with disturbed nasopharyngeal microbiota. On the basis of these results, it may be supposed that Lactobacillus GG or a specific combination containing LGG is effective against viral but not against bacterial otitis, and that the mechanism is probably mediated through stimulation of the immune system. A specific probiotic combination does not reduce respiratory infections in frail elderly subjects. Acute diarrhoea, either in children or in the elderly, is not prevented by the continuous, long-term consumption of probiotics, but the consumption of a specific probiotic combination in a food matrix is beneficial to the oral health of the elderly through a reduction in the carriage of Candida.
Abstract:
In recent reports, adolescents and young adults (AYA) with acute lymphoblastic leukemia (ALL) have had a better outcome with pediatric treatment protocols than with adult protocols. ALL can be classified into biologic subgroups according to immunophenotype and cytogenetics, with different clinical characteristics and outcomes. The proportions of the subgroups differ between children and adults, and the ALL subtypes in AYA patients are less well characterized. In this study, the treatment and outcome of ALL in AYA patients aged 10-25 years in Finland on pediatric and adult protocols were retrospectively analyzed. In total, 245 patients were included. The proportions of biologic subgroups in different age groups were determined. Patients with an initially normal or failed karyotype were examined with oligonucleotide microarray-based comparative genomic hybridization (aCGH). Deletions and instability of chromosome 9p were also screened for in ALL patients. In addition, patients with other hematologic malignancies were screened for 9p instability. aCGH data were also used to determine a gene set that classifies AYA patients at diagnosis according to their risk of relapse. Receiver operating characteristic analysis was used to assess the value of the gene set as a prognostic classifier. The 5-year event-free survival of AYA patients treated with pediatric or adult protocols was 67% and 60% (p=0.30), respectively. A white blood cell count greater than 100 × 10⁹/l was associated with poor prognosis. Patients treated with pediatric protocols and assigned to an intermediate-risk group fared significantly better than those of the pediatric high-risk or adult treatment groups. Deletions of 9p were detected in 46% of AYA ALL patients. The chromosomal region 9p21.3 was always affected, and the CDKN2A gene was always deleted. In about 15% of AYA patients, the 9p21.3 deletion was smaller than 200 kb and therefore probably undetectable with conventional methods.
Deletion of 9p was the most common aberration in AYA ALL patients with an initially normal karyotype. Instability of 9p, defined as multiple separate areas of copy number loss or a homozygous loss within a larger heterozygous area in 9p, was detected in 19% (n=27) of ALL patients. This abnormality was restricted to ALL; none of the patients with other hematologic malignancies had the aberration. The prognostic model identification procedure resulted in a model of four genes: BAK1, CDKN2B, GSTM1, and MT1F. The copy number profile combinations of these genes differentiated between AYA ALL patients at diagnosis according to their risk of relapse. Deletions of CDKN2B and BAK1 in combination with amplification of GSTM1 and MT1F were associated with a higher probability of relapse. In contrast to previous studies, we found that the outcome of AYA patients with ALL treated using pediatric or adult therapeutic protocols was comparable. The success of adult ALL therapy emphasizes the benefit of referring patients to academic centers and of adherence to research protocols. 9p deletions and instability are common features of ALL and may act together with oncogene-activating translocations in leukemogenesis. New and more sensitive methods of molecular cytogenetics can reveal previously cryptic genetic aberrations that play an important role in leukemic development and prognosis and that may be potential targets of therapy. aCGH also provides a viable approach for designing models to evaluate the risk of relapse in ALL.
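Receiver operating characteristic analysis, as used to assess the four-gene classifier, reduces to estimating the probability that a patient who relapses receives a higher risk score than one who does not: the area under the ROC curve equals the normalized Mann-Whitney U statistic. A minimal sketch, with hypothetical scores rather than the study's actual copy number model:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney U statistic: the fraction of
    (positive, negative) pairs in which the positive case scores
    higher. Ties count as one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical risk scores, e.g. how many of the four marker genes
# (BAK1, CDKN2B, GSTM1, MT1F) show the relapse-associated copy
# number change in each patient
relapse    = [4, 3, 3, 2]   # patients who relapsed
no_relapse = [1, 2, 0, 1]   # patients who stayed in remission
print(roc_auc(relapse, no_relapse))  # 0.96875 for these toy scores
```

An AUC of 0.5 would mean the gene set carries no prognostic information; values approaching 1.0 indicate that the copy number profile separates relapsing from non-relapsing patients well.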
Abstract:
Background: Brachial plexus birth palsy (BPBP) most often occurs as a result of foetal-maternal disproportion. The C5 and C6 nerve roots of the brachial plexus are most frequently affected. The C7 to Th1 roots, whose involvement together with C5 and C6 results in total injury, are affected in fewer than half of the patients. BPBP was first described by Smellie in 1764. Erb published his classical description of the injury in 1874, and his name became linked with the paralysis associated with upper root injury. Since then, early results of brachial plexus surgery have been reasonably well documented. However, from a clinical point of view, not all primary results are maintained, and there is also a need for later follow-up results. In addition, most published studies emanate from highly specialized clinics, and no nationwide epidemiological reports are available. One type of plexus injury is the avulsion injury, in which the nerve root or roots are torn from the spinal cord. It has been speculated whether this might cause injury to the whole neural system, or whether shoulder asymmetry and upper limb inequality result in postural deformities of the spine. Alternatively, avulsion could manifest as other signs and symptoms of the whole musculoskeletal system. In addition, no information is available on activities of daily living after obstetric brachial plexus surgery. Patients and methods: This was a population-based cross-sectional study of all patients who had undergone brachial plexus surgery with at least 5 years of follow-up. An incidence of 3.05/1000 for BPBP was obtained from the registers for this study period. A total of 1706 BPBP patients needing hospital treatment out of 1 717 057 newborns were registered in Finland between 1971 and 1997 inclusive. Of these BPBP patients, 124 (7.3%) underwent brachial plexus surgery at a mean age of 2.8 months (range: 0.4―13.2 months).
Surgery was most often performed by direct neurorrhaphy after neuroma resection (53%). Depending on the phase of the study, 105 to 112 patients (85-90%) participated in a clinical and radiological follow-up assessment. The mean follow-up time exceeded 13 years (range: 5.0―31.5 years). Functional status of the upper extremity was evaluated using the Mallet, Gilbert and Raimondi scales. Isometric strength of the upper limb, sensation of the hand and stereognosis were evaluated for both the affected and unaffected sides; the differences and their ratios were then calculated and recorded. In addition to the upper extremity, assessments of the spine and lower extremities were performed. Activities of daily living (ADL), participation in normal physical activities, and the use of physiotherapy and occupational therapy were recorded in a questionnaire. Results: The unaffected limb functioned as the dominant hand in all except four patients. The affected upper limb was shorter in 106 (95%) patients, by a mean of 6 cm (range: 1-13.5 cm). Shoulder function was recorded as a mean Mallet score of 3 (range: 2―4), which was moderate. Both elbow function and hand function were good: the mean Gilbert elbow scale value was 3 (range: -1―5) and the mean Raimondi hand scale value was 4 (range: 1―5). One-third of the patients experienced pain in the affected limb, including all those patients (n=9) who had clavicular non-union resulting from surgery. A total of 61 patients (57%) had an active shoulder external rotation of less than 0°, and an active elbow extension deficiency was noted in 82 patients (77%), with a mean of 26° (range: 5°―80°). In all except two patients, shoulder external rotation strength was impaired compared with the unaffected side (mean ratio 35%, range: 0―83%), and in all patients elbow flexion strength was impaired (mean ratio 41%, range: 0―79%).
According to radiographs, incongruence of the glenohumeral joint was noted in 15 (16%) patients, whereas incongruence of the radiohumeral joint was found in 20 (21%) patients. Fine sensation was normal in 34/49 (69%) patients with C5-6 injury, in 15/31 (48%) with C5-7 injury, and in only 8/25 (32%) patients with total injury. Loss of protective sensation or absent sensation was noted in some palmar areas of the hand in 12/105 patients (11%). Normal stereognosis was recorded in 88/105 patients (84%). No significant inequalities in leg length were found, and the incidence of structural scoliosis (1.7%) did not differ from that of the reference population. Nearly half of the patients (43%) had asynchronous motion of the upper limbs during gait, which was associated with impaired upper limb function. Data obtained from the completed questionnaires indicated that two-thirds (63%) of the patients were satisfied with the functional outcome of the affected hand, although one-third of all patients needed help with ADL. Only a few patients were unable to participate in physical activities such as bicycling, cross-country skiing or swimming. However, 71% of the patients reported problems related to the affected upper limb, such as muscle weakness and/or joint stiffness, during these activities. Incongruence of the radiohumeral joint, extent of the injury, avulsion-type injury, age less than three months at the time of plexus surgery, and inexperience of the surgeon were related to poor results, as determined by multivariate analyses. Conclusions: Most of the patients had persistent sequelae, especially of shoulder function. Almost all measurements for the total injury group were poorer than those of the C5-6 type injury group. Most of the patients had asymmetry of the shoulder region and a shorter affected upper limb, which is a probable reason for an abnormal gait.
However, BPBP did not affect normal growth of the lower extremities or the spine. Although participation in physical activities was similar to that of the normal population, two-thirds of the patients reported problems. One-third of the patients needed help with ADL. During the period covered by this study, 7.3% of BPBP patients who needed hospital treatment underwent a brachial plexus operation, which amounts to fewer than 10 operations per year in Finland. Better results of obstetric plexus surgery and more careful follow-up, including opportunities for late reconstructive procedures, can be expected if treatment is concentrated in a few specialised teams.
Abstract:
Germ cell tumors occur in the gonads of both sexes and in extragonadal sites during adolescence and early adulthood. Malignant ovarian germ cell tumors are rare neoplasms accounting for less than 5% of all cases of ovarian malignancy. In contrast, testicular cancer is the most common malignancy among young males. Most patients survive the disease. Prognostic factors of gonadal germ cell tumors include histology, clinical stage, size of the primary and residual tumor, and levels of tumor markers. Germ cell tumors comprise heterogeneous histological subgroups. The most common subgroup includes germinomas (ovarian dysgerminoma and testicular seminoma); other subgroups are yolk sac tumors, embryonal carcinomas, immature teratomas and mixed tumors. Germ cell tumors most likely originate from primordial germ cells. The factors behind germ cell tumor development and differentiation are still poorly understood. The purpose of this study was to define novel diagnostic and prognostic factors for malignant gonadal germ cell tumors. In addition, the aim was to shed further light on the molecular mechanisms regulating gonadal germ cell tumorigenesis and differentiation by studying the roles of GATA transcription factors, the pluripotency factors Oct-3/4 and AP-2γ, and estrogen receptors. This study revealed the prognostic value of CA-125 in malignant ovarian germ cell tumors. In addition, advanced age and residual tumor were associated with a more adverse outcome. Several novel markers for histological diagnosis were defined. During fetal development, the transcription factor GATA-4 was expressed in early fetal gonocytes and in testicular carcinoma precursor cells. GATA-4 was also expressed in germinomas of both gonads; thus, it may play a role in the development and differentiation of the germinoma tumor subtype. The pluripotency factors Oct-3/4 and AP-2γ were expressed in dysgerminomas and thus could be used in the differential diagnosis of germ cell tumors.
Malignant ovarian germ cell tumors expressed estrogen receptors and their co-regulator SNURF. In addition, estrogen receptor expression was up-regulated by estradiol stimulation. Thus, the pubertal surge of gonadal steroid hormones may play a role in germ cell tumor development in the ovary. This study shed further light on the molecular pathology of malignant gonadal germ cell tumors. In addition, some novel diagnostic and prognostic factors were defined. These data may be used in the differential diagnosis of germ cell tumor patients.