Abstract:
The complex relationship between thrombotic thrombocytopenic purpura (TTP) and pregnancy is concisely reviewed. Pregnancy is a very strong trigger for acute disease manifestation in patients with hereditary TTP caused by compound heterozygous or homozygous mutations of ADAMTS13 (A Disintegrin And Metalloprotease with ThromboSpondin type 1 domains, no. 13). In several affected women, disease onset during the first pregnancy leads to the diagnosis of hereditary TTP. Without plasma treatment, the mother and especially the fetus are at high risk of dying. The relapse risk during a subsequent pregnancy is almost 100%, but regular plasma transfusion started in early pregnancy prevents an acute TTP flare-up and may result in a successful pregnancy outcome. Pregnancy may also constitute a mild risk factor for the onset of acute acquired TTP caused by autoantibody-mediated severe ADAMTS13 deficiency. Women who have survived acute acquired TTP may not be at very high risk of TTP relapse during a subsequent pregnancy but seem to have an elevated risk of preeclampsia. Monitoring of ADAMTS13 activity and inhibitor titre during pregnancy may help to guide management and avoid disease recurrence. Finally, TTP needs to be distinguished from the much more frequent hypertensive pregnancy complications, preeclampsia and especially the HELLP (Hemolysis, Elevated Liver Enzymes, Low Platelet count) syndrome.
Abstract:
The congenital form of thrombotic thrombocytopenic purpura (TTP) is caused by mutations in ADAMTS13. Some, but not all, congenital TTP patients manifest renal insufficiency in addition to microangiopathic hemolysis and thrombocytopenia. We included 32 congenital TTP patients in the present study, which was designed to assess whether congenital TTP patients with renal insufficiency carry predisposing mutations in complement regulatory genes, as found in many patients with atypical hemolytic uremic syndrome (aHUS). In 13 patients with severe renal insufficiency, six candidate complement or complement regulatory genes were sequenced and 11 missense mutations were identified. One of these, C3:p.K155Q, is a rare mutation located in the macroglobulin-like 2 domain of C3, where other mutations predisposing to aHUS cluster. Several of the common missense mutations identified in our study have been reported to increase the risk of aHUS, but they were not more common in patients with renal insufficiency than in those without. Taken together, our results show that the majority of the congenital TTP patients with renal insufficiency studied here do not carry rare mutations in complement or complement regulatory genes.
Abstract:
BACKGROUND/AIMS Controversies still exist regarding the evaluation of growth hormone deficiency (GHD) in childhood at the end of growth. The aim of this study was to describe the natural history of GHD in a pediatric cohort. METHODS This is a retrospective study of a cohort of pediatric patients with GHD. Cases of acquired GHD were excluded. Univariate logistic regression was used to identify predictors of GHD persisting into adulthood. RESULTS Among 63 identified patients, 47 (75%) had partial GHD at diagnosis, while 16 (25%) had complete GHD, including 5 with multiple pituitary hormone deficiencies. At final height, 50 patients underwent repeat stimulation testing; 28 (56%) recovered and 22 (44%) remained growth hormone (GH) deficient. Predictors of persisting GHD were: complete GHD at diagnosis (OR 10.1, 95% CI 2.4-42.1), pituitary stalk defect or ectopic pituitary gland on magnetic resonance imaging (OR 6.5, 95% CI 1.1-37.1), greater height gain during GH treatment (OR 1.8, 95% CI 1.0-3.3), and IGF-1 level <-2 standard deviation scores (SDS) following treatment cessation (OR 19.3, 95% CI 3.6-103.1). In the multivariate analysis, only IGF-1 level <-2 SDS (OR 13.3, 95% CI 2.3-77.3) and complete GHD (OR 6.3, 95% CI 1.2-32.8) were associated with the outcome. CONCLUSION At final height, 56% of adolescents with GHD had recovered. Complete GHD at diagnosis, low IGF-1 levels following retesting, and pituitary malformation were strong predictors of persistence of GHD.
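As an illustration of the univariate screening step described above, the sketch below (Python with statsmodels, using invented data rather than the study cohort) fits a single-predictor logistic regression and reports the odds ratio with its 95% confidence interval; all values and variable names are hypothetical.

```python
# Minimal sketch (hypothetical data): univariate logistic regression as used to
# screen predictors of persistent GHD, reporting an odds ratio with 95% CI.
import numpy as np
import statsmodels.api as sm

# Hypothetical cohort: predictor 1 = complete GHD at diagnosis, outcome 1 = persistent GHD at retesting
complete_ghd = np.array([1, 1, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0])
persistent   = np.array([1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 0, 0])

X = sm.add_constant(complete_ghd)          # intercept + single predictor (univariate model)
fit = sm.Logit(persistent, X).fit(disp=0)  # maximum-likelihood logistic fit

or_estimate = np.exp(fit.params[1])            # odds ratio for the predictor
ci_low, ci_high = np.exp(fit.conf_int()[1])    # 95% confidence interval on the OR scale
print(f"OR = {or_estimate:.1f} (95% CI {ci_low:.1f}-{ci_high:.1f})")
```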
Abstract:
For people with disabilities and those around them, it is crucial to know which parties support them through their work in parliament. On questions concerning disability, a right-left divide tends to emerge under the dome of the Federal Parliament building. The centre parties, and in particular the CVP, appear to call this pattern into question, however.
Abstract:
Prematurity is the most common disruptor of lung development. The aim of our study was to examine the function of the more vulnerable peripheral airways in former preterm children by multiple-breath washout (MBW) measurements. 86 school-aged children born between 24 and 35 weeks of gestation and 49 term-born children performed nitrogen MBW. Lung clearance index (LCI) and slope III-derived Scond and Sacin were assessed as markers of global, convection-dependent and diffusion-convection-dependent ventilation inhomogeneity, respectively. We analysed the data of 77 former preterm children (mean (range) age 9.5 (7.2-12.8) years) and 46 term-born children (mean age 9.9 (6.0-15.9) years). LCI and Sacin did not differ between preterm and term-born children. Scond was significantly elevated in preterm compared to term-born participants (mean difference z-score 1.74, 95% CI 1.17-2.30; p<0.001), with 54% of former preterm children showing an elevated Scond. In multivariable regression analysis, Scond was significantly related only to gestational age (R(2)=0.37). Normal Sacin provides evidence for a functionally normal alveolar compartment, whereas elevated Scond indicates impaired function of the more proximal conducting airways. Together, our findings support the concept of continued alveolarisation, albeit with "dysanaptic" lung growth, in former preterm children.
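For readers unfamiliar with the z-score scale on which Scond is reported above, a minimal sketch follows; the reference mean and SD are illustrative placeholders, not the study's reference equations.

```python
# Minimal sketch (hypothetical reference values): expressing an individual Scond
# measurement as a z-score relative to a healthy reference population.
def z_score(measured: float, ref_mean: float, ref_sd: float) -> float:
    """Number of reference standard deviations the measurement lies above the reference mean."""
    return (measured - ref_mean) / ref_sd

# Example with made-up values: Scond of 0.045/L against a reference of 0.020 +/- 0.012/L
print(round(z_score(0.045, ref_mean=0.020, ref_sd=0.012), 2))  # -> 2.08
```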
Abstract:
BACKGROUND Lung clearance index (LCI), a marker of ventilation inhomogeneity, is elevated early in children with cystic fibrosis (CF). However, in infants with CF, LCI values are found to be normal, although structural lung abnormalities are often detectable. We hypothesized that this discrepancy is due to inadequate algorithms in the available software package. AIM Our aim was to challenge the validity of these software algorithms. METHODS We compared multiple breath washout (MBW) results obtained with the current software algorithms (automatic modus) to those obtained with refined algorithms (manual modus) in 17 asymptomatic infants with CF and 24 matched healthy term-born infants. The main difference between the two analysis methods lies in the calculation of the molar mass differences that the system uses to define the completion of the measurement. RESULTS In infants with CF, the refined manual modus revealed a clearly elevated LCI above 9 in 8 out of 35 measurements (23%), all of which showed LCI values below 8.3 with the automatic modus (paired t-test comparing the means, P < 0.001). Healthy infants showed normal LCI values with both analysis methods (n = 47, paired t-test, P = 0.79). The most relevant reason for falsely normal LCI values in infants with CF using the automatic modus was that the end of test was incorrectly recognized too early during the washout. CONCLUSION We recommend the use of the manual modus for the analysis of MBW outcomes in infants in order to obtain more accurate results. This will allow appropriate use of infant lung function results for clinical and scientific purposes.
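To make the end-of-test issue concrete, here is a minimal sketch of the conventional LCI calculation (washout considered complete once the end-tidal tracer concentration has stayed below 1/40 of its starting value; LCI = cumulative expired volume / FRC). This is the generic convention, not the vendor's automatic or the authors' manual algorithm, but it shows why ending the test too early yields a falsely low LCI: fewer breaths contribute to the cumulative expired volume.

```python
# Minimal sketch of the conventional LCI computation (illustrative, not the
# Exhalyzer D algorithm). The washout is considered complete once the end-tidal
# tracer concentration has fallen below 1/40 of its starting value for three
# consecutive breaths; implementations differ in whether the cumulative expired
# volume is taken at the first or the last of those breaths (here: the last).
def lung_clearance_index(end_tidal_conc, expired_volumes, frc, consecutive=3):
    """end_tidal_conc: per-breath end-tidal tracer concentration (fraction),
    expired_volumes: per-breath expired volume (L), frc: functional residual capacity (L)."""
    threshold = end_tidal_conc[0] / 40.0
    cumulative_volume = 0.0
    breaths_below = 0
    for conc, volume in zip(end_tidal_conc, expired_volumes):
        cumulative_volume += volume
        breaths_below = breaths_below + 1 if conc < threshold else 0
        if breaths_below == consecutive:
            return cumulative_volume / frc   # recognising end-of-test too early lowers this value
    raise ValueError("washout did not reach 1/40 of the starting concentration")
```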
Abstract:
BACKGROUND AND OBJECTIVES Multiple-breath washout (MBW) is an attractive test to assess ventilation inhomogeneity, a marker of peripheral lung disease. Standardization of MBW is hampered because little data exist on possible measurement bias. We aimed to identify potential sources of measurement bias based on MBW software settings. METHODS We used unprocessed data from nitrogen (N2) MBW (Exhalyzer D, Eco Medics AG) applied in 30 children aged 5-18 years: 10 with CF, 10 formerly preterm, and 10 healthy controls. This setup calculates the tracer gas N2 mainly from the measured O2 and CO2 concentrations. The following software settings for MBW signal processing were changed by at least 5 units or >10% in both directions or completely switched off: (i) environmental conditions, (ii) apparatus dead space, (iii) O2 and CO2 signal correction, and (iv) signal alignment (delay time). The primary outcome was the change in lung clearance index (LCI) compared to the LCI calculated with the recommended settings. A change in LCI exceeding 10% was considered relevant. RESULTS Changes in both environmental and dead space settings resulted in uniform but modest LCI changes and exceeded 10% in only two measurements. Changes in signal alignment and O2 signal correction had the most relevant impact on LCI. A decrease of the O2 delay time by 40 ms (7%) led to a mean LCI increase of 12%, with a >10% LCI change in 60% of the children. An increase of the O2 delay time by 40 ms resulted in a mean LCI decrease of 9%, with LCI changing >10% in 43% of the children. CONCLUSIONS Accurate LCI results depend crucially on signal processing settings in MBW software. In particular, incorrect signal delay times are a possible source of erroneous LCI measurements. Algorithms for signal processing and signal alignment should thus be optimized to avoid susceptibility of MBW measurements to this significant measurement bias.
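As a rough illustration of why the delay-time setting matters (a simplified sketch, not the Exhalyzer D signal chain): the N2 concentration is not measured directly but derived from O2 and CO2, so shifting the O2 trace relative to CO2 and flow changes the computed N2 within each breath and, downstream, the LCI.

```python
# Simplified sketch (not the vendor's algorithm): N2 derived from O2 and CO2, and
# the effect of a misaligned O2 delay time on the end-expiratory N2 value.
import numpy as np

def shift_later(signal, delay_samples):
    """Delay a sampled signal, padding with its first value (illustrative only)."""
    if delay_samples <= 0:
        return signal
    return np.concatenate([np.full(delay_samples, signal[0]), signal[:-delay_samples]])

def derived_n2(o2, co2, o2_delay_samples=0):
    """N2 computed as 1 - O2 - CO2 after (mis)aligning the O2 trace; argon and
    water vapour corrections are ignored for simplicity."""
    return 1.0 - shift_later(o2, o2_delay_samples) - co2

# Synthetic single expiration during a washout: O2 and CO2 rise, so derived N2 falls
o2  = np.linspace(0.21, 0.40, 50)
co2 = np.linspace(0.00, 0.05, 50)
print(derived_n2(o2, co2)[-1])                       # aligned end-expiratory N2
print(derived_n2(o2, co2, o2_delay_samples=4)[-1])   # misaligned O2: end-expiratory N2 shifts
```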
Abstract:
Pulmonary exacerbations are important clinical events for cystic fibrosis (CF) patients. Studies assessing the ability of the lung clearance index (LCI) to detect treatment response for pulmonary exacerbations have yielded heterogeneous results. Here, we conduct a retrospective analysis of pooled LCI data to assess treatment with intravenous antibiotics for pulmonary exacerbations and to understand factors explaining the heterogeneous response. A systematic literature search was performed to identify prospective observational studies. Factors predicting the relative change in LCI and spirometry were evaluated while adjusting for within-study clustering. Six previously reported studies and one unpublished study, which included 176 pulmonary exacerbations in both paediatric and adult patients, were included. Overall, LCI significantly decreased by 0.40 units (95% CI -0.60 to -0.19, p=0.004) or 2.5% following treatment. The relative change in LCI was significantly correlated with the relative change in forced expiratory volume in 1 s (FEV1), but results were discordant in 42.5% of subjects (80 out of 188). Higher (worse) baseline LCI was associated with a greater improvement in LCI (slope: -0.9%, 95% CI -1.0 to -0.4%). LCI response to therapy for pulmonary exacerbations is heterogeneous in CF patients; the overall effect size is small and results are often discordant with FEV1.
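The discordance referred to above can be expressed in a few lines. In this sketch (hypothetical values, and one common way of defining discordance rather than necessarily the pooled analysis's exact rule), a fall in LCI counts as improvement while a fall in FEV1 counts as worsening.

```python
# Minimal sketch (hypothetical values): relative change in LCI vs FEV1 after
# treatment. Improvement is a *decrease* in LCI but an *increase* in FEV1, so a
# pair is "discordant" when the two measures point in opposite directions.
def relative_change(pre: float, post: float) -> float:
    """Percent change from pre- to post-treatment."""
    return 100.0 * (post - pre) / pre

lci_change  = relative_change(pre=12.0, post=11.4)    # -5.0% -> LCI improved
fev1_change = relative_change(pre=2.10, post=2.02)    # -3.8% -> FEV1 worsened
discordant = (lci_change < 0) != (fev1_change > 0)    # improvement directions disagree
print(lci_change, fev1_change, discordant)
```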
Abstract:
Sarcoptic mange occurs in free-ranging wild boar (Sus scrofa) but has been poorly described in this species. We evaluated the performance of a commercial indirect enzyme-linked immunosorbent assay (ELISA) for serodiagnosis of sarcoptic mange in domestic swine when applied to wild boar sera. We tested 96 sera from wild boar in populations without mange history ("truly noninfected") collected in Switzerland between December 2012 and February 2014, and 141 sera from free-ranging wild boar presenting mange-like lesions, including 50 live animals captured and sampled multiple times in France between May and August 2006 and three cases submitted to necropsy in Switzerland between April 2010 and February 2014. Mite infestation was confirmed by skin scraping in 20 of them ("truly infected"). We defined sensitivity of the test as the proportion of truly infected that were found ELISA-positive, and specificity as the proportion of truly noninfected that were found negative. Sensitivity and specificity were 75% and 80%, respectively. Success of antibody detection increased with the chronicity of lesions, and seroconversion was documented in 19 of 27 wild boar sampled multiple times that were initially negative or doubtful. In conclusion, the evaluated ELISA has been successfully applied to wild boar sera. It appears to be unreliable for early detection in individual animals but may represent a useful tool for population surveys.
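The performance measures are exactly the proportions defined in the abstract; the sketch below uses illustrative counts consistent with the reported 75% and 80% (the exact positive/negative tallies are an assumption).

```python
# Minimal sketch (hypothetical counts, not the study's raw data): sensitivity and
# specificity as defined in the text, from the numbers of truly infected and
# truly noninfected animals that the ELISA classifies correctly.
def sensitivity(true_positives: int, truly_infected: int) -> float:
    """Proportion of truly infected animals found ELISA-positive."""
    return true_positives / truly_infected

def specificity(true_negatives: int, truly_noninfected: int) -> float:
    """Proportion of truly noninfected animals found ELISA-negative."""
    return true_negatives / truly_noninfected

# Illustrative counts only: e.g. 15 of 20 infected positive, 77 of 96 noninfected negative
print(f"sensitivity = {sensitivity(15, 20):.0%}, specificity = {specificity(77, 96):.0%}")
```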
Abstract:
Most of the phyllosilicates detected at the surface of Mars today are probably remnants of ancient environments that sustained long-term bodies of liquid water at the surface or subsurface and were possibly favorable for the emergence of life. Consequently, phyllosilicates have become the main mineral target in the search for organics on Mars. But are phyllosilicates efficient at preserving organic molecules under current environmental conditions at the surface of Mars? We monitored the qualitative and quantitative evolutions of glycine, urea, and adenine in interaction with the Fe3+-smectite clay nontronite, one of the most abundant phyllosilicates present at the surface of Mars, under simulated martian surface ultraviolet light (190-400 nm), mean temperature (218 +/- 2 K), and pressure (6 +/- 1 mbar) in a laboratory simulation setup. We tested organic-rich samples that were representative of the evaporation of a small, warm pond of liquid water containing a high concentration of organics. For each molecule, we observed how the nontronite influences its quantum efficiency of photodecomposition and the nature of its solid evolution products. The results reveal a pronounced photoprotective effect of nontronite on the evolution of glycine and adenine; their efficiencies of photodecomposition were reduced by a factor of 5 when mixed at a concentration of 2.6x10(-2) mol of molecules per gram of nontronite. Moreover, when the amount of nontronite in the sample of glycine was increased by a factor of 2, the gain of photoprotection was multiplied by a factor of 5. This indicates that the photoprotection provided by the nontronite is not a purely mechanical shielding effect but is also due to stabilizing interactions. No new evolution product was firmly identified, but the results obtained with urea suggest a particular reactivity in the presence of nontronite, leading to an increase of its dissociation rate. Key Words: Martian surface-Organic chemistry-Photochemistry-Astrochemistry-Nontronite-Phyllosilicates. Astrobiology 15, 221-237.
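For orientation, the quantum efficiency of photodecomposition compares molecules lost to UV photons absorbed, and the factor-of-5 photoprotection quoted above is a ratio of two such efficiencies; the numbers in this sketch are invented, not the paper's measurements.

```python
# Minimal sketch (hypothetical numbers): quantum efficiency of photodecomposition
# and the "photoprotection factor" as a ratio of efficiencies without vs with the
# nontronite matrix.
def quantum_efficiency(molecules_decomposed: float, photons_absorbed: float) -> float:
    """Molecules decomposed per UV photon absorbed."""
    return molecules_decomposed / photons_absorbed

phi_pure  = quantum_efficiency(5.0e14, 1.0e19)   # organic film alone (illustrative)
phi_mixed = quantum_efficiency(1.0e14, 1.0e19)   # same molecule mixed with nontronite (illustrative)
print(phi_pure / phi_mixed)                      # -> 5.0, i.e. a factor-5 photoprotection
```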
Abstract:
OBJECTIVE To illustrate an approach to compare CD4 cell count and HIV-RNA monitoring strategies in HIV-positive individuals on antiretroviral therapy (ART). DESIGN Prospective studies of HIV-positive individuals in Europe and the USA in the HIV-CAUSAL Collaboration and The Center for AIDS Research Network of Integrated Clinical Systems. METHODS Antiretroviral-naive individuals who initiated ART and became virologically suppressed within 12 months were followed from the date of suppression. We compared three CD4 cell count and HIV-RNA monitoring strategies: once every (1) 3 ± 1 months, (2) 6 ± 1 months, and (3) 9-12 ± 1 months. We used inverse-probability weighted models to compare these strategies with respect to clinical, immunologic, and virologic outcomes. RESULTS In 39,029 eligible individuals, there were 265 deaths and 690 AIDS-defining illnesses or deaths. Compared with the 3-month strategy, the mortality hazard ratios (95% CIs) were 0.86 (0.42 to 1.78) for the 6-month strategy and 0.82 (0.46 to 1.47) for the 9-12-month strategy. The respective 18-month risk ratios (95% CIs) of virologic failure (RNA >200) were 0.74 (0.46 to 1.19) and 2.35 (1.56 to 3.54), and the 18-month mean CD4 differences (95% CIs) were -5.3 (-18.6 to 7.9) and -31.7 (-52.0 to -11.3) cells/μL. The estimates for the 2-year risk of AIDS-defining illness or death were similar across strategies. CONCLUSIONS Our findings suggest that the monitoring frequency of virologically suppressed individuals can be decreased from every 3 months to every 6, 9, or 12 months with respect to clinical outcomes. Because the effects of different monitoring strategies could take years to materialize, longer follow-up is needed to fully evaluate this question.
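A heavily simplified sketch of the inverse-probability weighting idea used in the analysis (synthetic data; a single baseline covariate; no time-varying adherence or censoring, which the real analysis handles): each individual is weighted by the inverse of the estimated probability of following the monitoring strategy they actually followed, and the weighted outcome model then contrasts strategies.

```python
# Minimal sketch (synthetic data, heavily simplified) of inverse-probability
# weighting: estimate each subject's probability of their observed strategy from
# baseline covariates, weight by its inverse (stabilized), then fit a weighted
# outcome model comparing strategies.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
age = rng.normal(40, 10, n)                                    # baseline covariate
p_6monthly = 1 / (1 + np.exp(-(0.02 * (age - 40))))            # adherence depends on age
strategy_6m = rng.binomial(1, p_6monthly)                      # 1 = 6-monthly, 0 = 3-monthly
outcome = rng.binomial(1, 0.05 + 0.001 * (age - 40).clip(0))   # event risk depends on age only

# Model the probability of the observed strategy and build stabilized weights
ps_fit = sm.Logit(strategy_6m, sm.add_constant(age)).fit(disp=0)
ps = ps_fit.predict(sm.add_constant(age))
weights = np.where(strategy_6m == 1, strategy_6m.mean() / ps, (1 - strategy_6m.mean()) / (1 - ps))

# Weighted outcome model: the strategy coefficient estimates the effect of interest
ipw_fit = sm.GLM(outcome, sm.add_constant(strategy_6m),
                 family=sm.families.Binomial(), freq_weights=weights).fit()
print(np.exp(ipw_fit.params[1]))   # odds ratio, 6-monthly vs 3-monthly monitoring
```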