1000 results for Menstrual related migraine (MRM)
Abstract:
PURPOSE: To investigate the rhythm and predictability of the need for retreatment with intravitreal injections of ranibizumab for neovascular age-related macular degeneration (nAMD). METHODS: This prospective study enrolled 39 patients with treatment-naïve nAMD. After three loading doses of intravitreal ranibizumab, patients underwent an intensified follow-up for 12 months (initially weekly, then with stepwise increases to every 2 weeks and to monthly after each injection). Patients were retreated on an as-needed basis if any fluid or increased central retinal thickness (CRT) (>50 μm) was found on spectral domain optical coherence tomography (OCT). Statistical analysis included patients who received at least two retreatments (five injections). RESULTS: A mean of 7.5 injections (range 0-12) was given between months 3 and 15. The mean visual acuity increased by 13.1 and 12.6 ETDRS letters at months 12 and 15, respectively. Two or more injection-retreatment intervals were found in 31 patients. The variability of their intra-individual intervals of up to 14 weeks was small (SD 0-2.13 weeks), revealing a high regularity of the retreatment rhythm. The SD was correlated with the mean interval duration (r = 0.89, p < 0.001). The first interval was a good predictor of the following intervals (regression coefficient = 0.81). One retreatment criterion (cysts or subretinal fluid) was stable in 97% of patients. CONCLUSION: The results of this study demonstrate a high intra-individual predictability of retreatment need with ranibizumab injections for nAMD. These findings may be helpful for developing individualized treatment plans for maintained suppression of disease activity with a minimum of injections and visits.
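The interval analysis reported above can be illustrated with a short sketch. This is not the study's code: the per-patient retreatment intervals are hypothetical, and the use of a Pearson correlation for the SD-versus-mean relationship and a simple linear regression for the first-interval predictor are assumptions made for illustration.

```python
# Sketch of the interval statistics described in the abstract (hypothetical data).
import numpy as np
from scipy import stats

# Hypothetical injection-retreatment intervals (weeks) for a few patients.
patients = {
    "P01": [6, 6, 7, 6],
    "P02": [10, 9, 11, 10],
    "P03": [4, 5, 4, 4, 5],
    "P04": [12, 13, 12, 14],
}

means = np.array([np.mean(v) for v in patients.values()])
sds = np.array([np.std(v, ddof=1) for v in patients.values()])

# Correlation between intra-individual SD and mean interval duration.
r, p = stats.pearsonr(means, sds)
print(f"r = {r:.2f}, p = {p:.3f}")

# First interval as a predictor of the mean of the subsequent intervals.
first = np.array([v[0] for v in patients.values()])
later = np.array([np.mean(v[1:]) for v in patients.values()])
slope, intercept, r_reg, p_reg, se = stats.linregress(first, later)
print(f"regression coefficient = {slope:.2f}")
```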
Abstract:
Cavernomas after radiotherapy, developing in irradiated children treated for malignant brain tumors, are capillary malformations that are frequently asymptomatic and benign in their evolution. However, in some children they can lead to haemorrhage, which can cause symptoms and require surgical intervention. Although there is increasing evidence of cavernoma as a possible long-term sequela of radiotherapy, information on very long follow-up is still needed. Different groups have studied this problem, focusing on incidence and on the lag time between radiotherapy and the appearance of cavernomas. Results showed that this period can be long and that the cumulative incidence increases over the years, but the figures vary between publications. More recently, researchers have tried to relate several predictive factors to the incidence of cavernomas, such as age at radiotherapy, gender, type of cancer, and chemotherapy. No relation has been found except an increased incidence when radiotherapy was started before the age of ten. Rationale: The observations reported so far comprise very heterogeneous cohorts of patients. No study has been restricted to patients with the malignant brain tumors that are typical of childhood. As for the studied predictive factors, no publication has described the technical aspects of the radiotherapy. Objectives: To study a population of pediatric patients with only malignant brain tumors in order to calculate the incidence of cavernomas after radiotherapy and their evolution over a longer period than in previously published research. To analyse known predictive factors such as age at radiotherapy, gender, and type of cancer. To study extensively the role of technical aspects of radiotherapy in the occurrence of cavernomas. Methodology: Retrospective study of a group of 62 children irradiated at the CHUV (Lausanne, Switzerland) between 1975 and 2010 for the following malignant brain cancers: medulloblastoma, ependymoma, PNET. Post-radiotherapy MRI images will be analysed by a neuroradiologist, and a radiotherapist will interpret the radiotherapy data. Expected results: We expect to find relations between the incidence of cavernomas after radiotherapy and the predictive factors, including different radiotherapy techniques, and consequently to define the best long-term follow-up of children at risk.
Abstract:
Site-specific regression coefficient values are essential for erosion prediction with empirical models. With the objective of investigating the surface-soil-consolidation factor, Cf, linked to the RUSLE's prior-land-use subfactor, PLU, an erosion experiment using simulated rainfall on a 0.075 m m-1 slope, sandy loam Paleudult soil, was conducted at the Agriculture Experimental Station of the Federal University of Rio Grande do Sul (EEA/UFRGS), in Eldorado do Sul, State of Rio Grande do Sul, Brazil. First, a row-cropped area was excluded from cultivation (March 1995), the existing crop residue was removed from the field, and the soil was kept clean-tilled for the rest of the year (to obtain a degraded soil condition for the intended purpose of this research). The soil was then conventionally tilled for the last time (except for a standard plot which was kept continuously clean-tilled for comparison purposes) in January 1996, and the following treatments were established and evaluated for soil reconsolidation and soil erosion until May 1998, on duplicated 3.5 x 11.0 m erosion plots: (a) fresh-tilled soil, continuously in clean-tilled fallow (unit plot); (b) reconsolidating soil without cultivation; and (c) reconsolidating soil with cultivation (a crop sequence of three corn and two black oat cycles, continuously in no-till, removing the crop residues after each harvest for rainfall application and redistributing them on the site afterwards). Simulated rainfall was applied with a Swanson-type, rotating-boom rainfall simulator, at 63.5 mm h-1 intensity and 90 min duration, six times during the two-and-a-half years of the experimental period (at the beginning of the study and after each crop harvest, with the soil in the unit plot being retilled before each rainfall test). The soil-surface-consolidation factor, Cf, was calculated by dividing soil loss values from the reconsolidating soil treatments by the average value from the fresh-tilled soil treatment (unit plot). Non-linear regression was used to fit the Cf = e^(b·t) model through the calculated Cf data, where t is time in days since last tillage. Values for b were -0.0020 for the reconsolidating soil without cultivation and -0.0031 for the one with cultivation, yielding Cf values equal to 0.16 and 0.06, respectively, after two and a half years of tillage discontinuation, compared to 1.0 for fresh-tilled soil. These estimated Cf values correspond, respectively, to soil loss reductions of 84 and 94% relative to the soil loss from the fresh-tilled soil, showing that the soil surface reconsolidated more intensely with cultivation than without it. Two distinct treatment-inherent soil surface conditions probably influenced the rapid decay rate of Cf values in this study, but, as a matter of fact, they were part of the real environmental field conditions. The Cf-factor curves presented in this paper are therefore useful for predicting erosion with RUSLE, but their application is restricted to situations where both soil type and the particular soil surface condition are similar to the ones investigated in this study.
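A minimal sketch of the exponential consolidation model Cf = e^(b·t) follows, assuming SciPy's curve_fit for the non-linear regression. The (t, Cf) observations used for the fit are hypothetical; the final loop evaluates the b values actually reported above (-0.0020 and -0.0031) at roughly two and a half years (912 days), reproducing the quoted Cf values of about 0.16 and 0.06.

```python
# Sketch (not the authors' code) of fitting Cf = exp(b * t) and evaluating the
# reported coefficients after ~2.5 years of tillage discontinuation.
import numpy as np
from scipy.optimize import curve_fit

def cf_model(t, b):
    """Soil-surface-consolidation factor as a function of days since last tillage."""
    return np.exp(b * t)

# Hypothetical observations: days since last tillage and measured Cf.
t_days = np.array([0, 90, 180, 360, 540, 720, 912], dtype=float)
cf_obs = np.array([1.00, 0.75, 0.58, 0.35, 0.22, 0.15, 0.10])

(b_fit,), _ = curve_fit(cf_model, t_days, cf_obs, p0=[-0.001])
print(f"fitted b = {b_fit:.4f}")

# Reported coefficients: without cultivation and with cultivation.
for label, b in [("without cultivation", -0.0020), ("with cultivation", -0.0031)]:
    cf = cf_model(912, b)  # ~2.5 years after tillage discontinuation
    print(f"{label}: Cf = {cf:.2f}, soil-loss reduction = {100 * (1 - cf):.0f}%")
```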
Abstract:
Erosion is deleterious because it reduces the soil's productive capacity for growing crops and causes sedimentation and water pollution problems. Surface and buried crop residue, as well as live and dead plant roots, play an important role in erosion control. An efficient way to assess the effectiveness of such materials in erosion reduction is by means of decomposition constants as used within the Revised Universal Soil Loss Equation (RUSLE) prior-land-use subfactor, PLU. This was investigated using simulated rainfall on a 0.12 m m-1 slope, sandy loam Paleudult soil, at the Agriculture Experimental Station of the Federal University of Rio Grande do Sul, in Eldorado do Sul, State of Rio Grande do Sul, Brazil. The study area had been covered by native grass pasture for about fifteen years. By the middle of March 1996, the sod was mechanically mowed and the crop residue removed from the field. Late in April 1996, the sod was chemically desiccated with herbicide and, about one month later, the following treatments were established and evaluated for sod biomass decomposition and soil erosion, from June 1996 to May 1998, on duplicated 3.5 x 11.0 m erosion plots: (a) and (b) soil without tillage, with surface residue and dead roots; (c) soil without tillage, with dead roots only; (d) soil tilled conventionally every two-and-a-half months, with dead roots plus incorporated residue; and (e) soil tilled conventionally every six months, with dead roots plus incorporated residue. Simulated rainfall was applied with a rotating-boom rainfall simulator, at an intensity of 63.5 mm h-1 for 90 min, eight to nine times during the experimental period (about every two-and-a-half months). Surface and subsurface sod biomass amounts were measured before each rainfall test along with the erosion measurements of runoff rate, sediment concentration in runoff, soil loss rate, and total soil loss. Non-linear regression analysis was performed using an exponential and a power model. Surface sod biomass decomposition was better depicted by the exponential model, while subsurface sod biomass decomposition was better depicted by the power model. Subsurface sod biomass decomposed faster and to a greater extent than surface sod biomass, with dead roots in untilled soil without residue on the surface decomposing more than dead roots in untilled soil with surface residue. Tillage type and frequency did not appreciably influence subsurface sod biomass decomposition. Soil loss rates increased greatly with both surface sod biomass decomposition and decomposition of subsurface sod biomass in the conventionally tilled soil, but they were minimally affected by subsurface sod biomass decomposition in the untilled soil. Runoff rates were little affected by the studied treatments. Dead roots plus incorporated residues were effective in reducing erosion in the conventionally tilled soil, while consolidation of the soil surface was important in no-till. The residual effect of the turned soil on erosion diminished gradually with time and ceased after two years.
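As a sketch of the model comparison described above (exponential versus power decay of sod biomass fitted by non-linear regression), the snippet below fits both forms by non-linear least squares and compares residual sums of squares. The biomass values, units, and exact parameterisations are hypothetical; the abstract does not report them.

```python
# Sketch: compare an exponential and a power decay model for residual sod biomass.
import numpy as np
from scipy.optimize import curve_fit

def exponential(t, b0, k):
    return b0 * np.exp(-k * t)

def power(t, a, b):
    return a * np.power(t, -b)  # undefined at t = 0, so time starts at 1 day

t = np.array([1, 75, 150, 225, 300, 450, 600, 720], dtype=float)  # days
biomass = np.array([5.0, 3.6, 2.9, 2.3, 2.0, 1.6, 1.3, 1.2])      # Mg ha-1 (hypothetical)

for name, model, p0 in [("exponential", exponential, (5.0, 0.01)),
                        ("power", power, (5.0, 0.2))]:
    params, _ = curve_fit(model, t, biomass, p0=p0)
    rss = np.sum((biomass - model(t, *params)) ** 2)
    print(f"{name}: params = {np.round(params, 4)}, RSS = {rss:.3f}")
```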
Abstract:
OBJECTIVE: To evaluate the relative efficacy and safety profile of bevacizumab versus ranibizumab intravitreal injections for the treatment of neovascular age-related macular degeneration (AMD). DESIGN: Multicenter, prospective, noninferiority, double-masked, randomized clinical trial performed in 38 French ophthalmology centers. The noninferiority limit was 5 letters. PARTICIPANTS: Patients aged ≥50 years were eligible if they presented with subfoveal neovascular AMD, with best-corrected visual acuity (BCVA) in the study eye of between 20/32 and 20/320 measured on the Early Treatment Diabetic Retinopathy Study chart and a lesion area of less than 12 optic disc areas (DA). METHODS: Patients were randomly assigned to intravitreal administration of bevacizumab (1.25 mg) or ranibizumab (0.50 mg). Hospital pharmacies were responsible for preparing, blinding, and dispensing treatments. Patients were followed for 1 year, with a loading dose of 3 monthly intravitreal injections, followed by an as-needed regimen (1 injection in case of active disease) for the remaining 9 months with monthly follow-up. MAIN OUTCOME MEASURES: Mean change in visual acuity at 1 year. RESULTS: Between June 2009 and November 2011, 501 patients were randomized. In the per protocol analysis, bevacizumab was noninferior to ranibizumab (bevacizumab minus ranibizumab: +1.89 letters; 95% confidence interval [CI], -1.16 to +4.93; P < 0.0001). The intention-to-treat analysis was concordant. The mean number of injections was 6.8 in the bevacizumab group and 6.5 in the ranibizumab group (P = 0.39). Both drugs reduced the central subfield macular thickness, with a mean decrease of 95 μm for bevacizumab and 107 μm for ranibizumab (P = 0.27). There were no significant differences in the presence of subretinal or intraretinal fluid at final evaluation, dye leakage on angiogram, or change in choroidal neovascular area. The proportion of patients with serious adverse events was 12.6% in the bevacizumab group and 12.1% in the ranibizumab group (P = 0.88). The proportion of patients with serious systemic or ocular adverse events was similar in both groups. CONCLUSIONS: Bevacizumab was noninferior to ranibizumab for visual acuity at 1 year, with similar safety profiles. Ranibizumab tended to have a better anatomic outcome. The results are similar to those of previous head-to-head studies. FINANCIAL DISCLOSURE(S): Proprietary or commercial disclosure may be found after the references.
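The noninferiority conclusion follows from comparing the confidence interval with the prespecified margin: the lower bound of the 95% CI (-1.16 letters) lies above the -5-letter limit. A minimal sketch of that check, using only the figures quoted above:

```python
# Noninferiority check: bevacizumab is noninferior if the lower 95% CI bound for
# the mean difference in letter change (bevacizumab minus ranibizumab) is above -5.
NONINFERIORITY_MARGIN = -5.0      # letters, prespecified limit
difference = 1.89                 # bevacizumab minus ranibizumab (letters)
ci_lower, ci_upper = -1.16, 4.93  # 95% confidence interval

noninferior = ci_lower > NONINFERIORITY_MARGIN
print(f"difference = {difference} letters, 95% CI [{ci_lower}, {ci_upper}]")
print("noninferior" if noninferior else "noninferiority not shown")
```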
Abstract:
Neuroimaging studies typically compare experimental conditions using average brain responses, thereby overlooking the stimulus-related information conveyed by distributed spatio-temporal patterns of single-trial responses. Here, we take advantage of this rich information at the single-trial level to decode stimulus-related signals in two event-related potential (ERP) studies. Our method models the statistical distribution of the voltage topographies with a Gaussian Mixture Model (GMM), which reduces the dataset to a number of representative voltage topographies. The degree of presence of these topographies across trials at specific latencies is then used to classify experimental conditions. We tested the algorithm using a cross-validation procedure in two independent EEG datasets. In the first ERP study, we classified left- versus right-hemifield checkerboard stimuli for upper and lower visual hemifields. In the second ERP study, where functional differences cannot be assumed, we classified initial versus repeated presentations of visual objects. With minimal a priori information, the GMM provides neurophysiologically interpretable features, namely voltage topographies, as well as dynamic information about brain function. This method can in principle be applied to any ERP dataset to test the functional relevance of specific time periods for stimulus processing, the predictability of subjects' behavior and cognitive states, and the discrimination between healthy and clinical populations.
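A rough sketch of how such a GMM-based topographic decoder could be assembled with scikit-learn is shown below. The synthetic data, the diagonal covariance, the latency window, and the logistic-regression classifier are all assumptions for illustration, not details taken from the study.

```python
# Sketch: cluster single-trial voltage topographies with a GMM, use the per-trial
# "degree of presence" of each representative topography in a latency window as
# features, and classify conditions under cross-validation.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_times, n_electrodes, n_maps = 80, 50, 32, 4

# Synthetic single-trial EEG: (trials, time points, electrodes).
eeg = rng.normal(size=(n_trials, n_times, n_electrodes))
labels = rng.integers(0, 2, size=n_trials)   # two experimental conditions
eeg[labels == 1, 20:30, :8] += 0.5           # inject a condition-specific effect

# 1) Fit the GMM on all topographies pooled across trials and time points.
topographies = eeg.reshape(-1, n_electrodes)
gmm = GaussianMixture(n_components=n_maps, covariance_type="diag",
                      random_state=0).fit(topographies)

# 2) Per trial, average the posterior probability of each map over a latency window.
window = slice(20, 30)
features = np.stack([gmm.predict_proba(trial[window]).mean(axis=0) for trial in eeg])

# 3) Cross-validated classification of the two conditions.
scores = cross_val_score(LogisticRegression(max_iter=1000), features, labels, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```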
Abstract:
OBJECTIVE: To identify pregnancy-related risk factors for different manifestations of congenital anorectal malformations (ARMs). DESIGN: A population-based case-control study. SETTING: Seventeen EUROCAT (European Surveillance of Congenital Anomalies) registries, 1980-2008. POPULATION: The study population consisted of 1417 cases with ARM, including 648 cases of isolated ARM, 601 cases of ARM with additional congenital anomalies, and 168 cases of ARM-VACTERL (vertebral, anal, cardiac, tracheo-esophageal, renal, and limb defects), along with 13 371 controls with recognised syndromes or chromosomal abnormalities. METHODS: Multiple logistic regression analyses were used to calculate adjusted odds ratios (ORs) for potential risk factors for ARM, such as fertility treatment, multiple pregnancy, primiparity, maternal illnesses during pregnancy, and pregnancy-related complications. MAIN OUTCOME MEASURES: Adjusted ORs for pregnancy-related risk factors for ARM. RESULTS: The ARM cases were more likely to be firstborn than the controls (OR 1.6, 95% CI 1.4-1.8). Fertility treatment and being one of twins or triplets seemed to increase the risk of ARM in cases with additional congenital anomalies or VACTERL (ORs ranging from 1.6 to 2.5). Maternal fever during pregnancy and pre-eclampsia were only associated with ARM when additional congenital anomalies were present (OR 3.9, 95% CI 1.3-11.6; OR 3.4, 95% CI 1.6-7.1, respectively), whereas maternal epilepsy during pregnancy resulted in a five-fold elevated risk of all manifestations of ARM (OR 5.1, 95% CI 1.7-15.6). CONCLUSIONS: This large European study identified maternal epilepsy, fertility treatment, multiple pregnancy, primiparity, pre-eclampsia, and maternal fever during pregnancy as potential risk factors primarily for complex manifestations of ARM with additional congenital anomalies and VACTERL.
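The adjusted odds ratios described above come from multiple logistic regression. A minimal sketch with synthetic data follows; the covariate set and the statsmodels formula are assumptions for illustration only, not the study's actual model.

```python
# Sketch: adjusted ORs as exponentiated coefficients of a multiple logistic model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 5000
df = pd.DataFrame({
    "case": rng.integers(0, 2, n),              # 1 = ARM case, 0 = control (synthetic)
    "primiparity": rng.integers(0, 2, n),
    "fertility_treatment": rng.integers(0, 2, n),
    "maternal_fever": rng.integers(0, 2, n),
    "maternal_age": rng.normal(30, 5, n),
})

model = smf.logit(
    "case ~ primiparity + fertility_treatment + maternal_fever + maternal_age",
    data=df,
).fit(disp=False)

odds_ratios = np.exp(model.params)
ci = np.exp(model.conf_int())
print(pd.concat([odds_ratios.rename("OR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```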
Abstract:
INTRODUCTION: The influence of specific health problems on health-related quality of life (HRQoL) in childhood cancer survivors is unknown. We compared HRQoL between survivors of childhood cancer and their siblings, determined factors associated with HRQoL, and investigated the influence of chronic health problems on HRQoL. METHODS: Within the Swiss Childhood Cancer Survivor Study, we sent a questionnaire to all survivors (≥16 years) registered in the Swiss Childhood Cancer Registry, who survived >5 years and were diagnosed 1976-2005 aged <16 years. Siblings received similar questionnaires. We assessed HRQoL using Short Form-36 (SF-36). Health problems from a standard questionnaire were classified into overweight, vision impairment, hearing, memory, digestive, musculoskeletal or neurological, and thyroid problems. RESULTS: The sample included 1,593 survivors and 695 siblings. Survivors scored significantly lower than siblings in physical function, role limitation, general health, and the Physical Component Summary (PCS). Lower score in PCS was associated with a diagnosis of central nervous system tumor, retinoblastoma or bone tumor, having had surgery, cranio-spinal irradiation, or bone marrow transplantation. Lower score in Mental Component Summary was associated with older age. All health problems decreased HRQoL in all scales. Most affected were survivors reporting memory problems and musculoskeletal or neurological problems. Health problems had the biggest impact on physical functioning, general health, and energy and vitality. CONCLUSIONS: In this study, we showed the negative impact of specific chronic health problems on survivors' HRQoL. IMPLICATIONS FOR CANCER SURVIVORS: Therapeutic preventive measures, risk-targeted follow-up, and interventions might help decrease health problems and, consequently, improve survivors' quality of life.
Abstract:
The whole-body sweating response was measured at rest in eight women during the follicular (F) and the luteal (L) phases of the menstrual cycle. Subjects were exposed for 30 min to neutral (N) environmental conditions [ambient temperature (Ta) 28°C] and then for 90 min to warm (W) environmental conditions (Ta 35°C) in a direct calorimeter. At the end of the N exposure, tympanic temperature (Tty) was 0.18 (SEM 0.06)°C higher in the L than in the F phase (P < 0.05), whereas mean skin temperature (Tsk) was unchanged. During W exposure, the time to the onset of sweating as well as the concomitant increase in body heat content were similar in both phases. At the onset of sweating, the tympanic threshold temperature (Tty,thresh) was higher in the L phase [37.18 (SEM 0.08)°C] than in the F phase [36.95 (SEM 0.07)°C; P < 0.01]. The magnitude of the shift in Tty,thresh [0.23 (SEM 0.07)°C] was similar to the L-F difference in Tty observed at the end of the N exposure. The mean skin threshold temperature was not statistically different between the two phases. The slope of the relationship between sweating rate and Tty was similar in F and L. It was concluded that the internal set point temperature of resting women exposed to warm environmental conditions shifted to a higher value during the L phase compared to the F phase of the menstrual cycle, and that the magnitude of the shift corresponded to the difference in internal temperature observed in neutral environmental conditions between the two phases.
Abstract:
We present a novel approach for analyzing single-trial electroencephalography (EEG) data using topographic information. The method allows event-related potentials to be visualized using all recording electrodes, overcoming the limitation of previous approaches that required electrode selection and waveform filtering. We apply this method to EEG data from an auditory object recognition experiment that we had previously analyzed at the ERP level. Temporally structured periods wherein a given topography predominated were statistically identified without any prior information about the temporal behavior. In addition to providing novel methods for EEG analysis, the data indicate that ERPs are reliably observable at the single-trial level when examined topographically.
Abstract:
Although tremendous advances have been made in the diagnosis and treatment of patients, hospital administrative systems have progressed relatively slowly. The types of information available to managers in industrial sectors are not available in the health sector. For this reason, many phenomena, such as the variations in average costs and lengths of stay between different hospitals, have remained poorly explained. The DRG system defines groups of patients that consume relatively homogeneous quantities of hospital resources. On this basis, it is possible to standardize average lengths of stay and average hospital costs in terms of the differences in the case mix treated. Thus DRGs can help explain variations in these factors between different hospitals, and can also (but not only) be used for prospective reimbursement schemes. As in a number of other European countries, a project has been set up in Switzerland to examine the possibilities of using DRGs in hospital management, planning, and financing.
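A toy example of the case-mix standardization that DRGs make possible: a hospital's observed mean length of stay is compared with the stay expected from reference per-DRG means, weighted by that hospital's own case mix. All figures and DRG codes are hypothetical; this is one common form of indirect standardization, not necessarily the project's method.

```python
# Sketch of case-mix-adjusted (indirectly standardized) length of stay.
reference_mean_los = {"DRG_127": 5.2, "DRG_089": 7.8, "DRG_209": 4.1}  # days (hypothetical)
hospital_cases    = {"DRG_127": 300, "DRG_089": 120, "DRG_209": 480}   # hypothetical counts
hospital_mean_los = 5.0                                                # observed, days

total_cases = sum(hospital_cases.values())
expected_los = sum(reference_mean_los[d] * n for d, n in hospital_cases.items()) / total_cases

ratio = hospital_mean_los / expected_los
print(f"expected (case-mix adjusted) LOS = {expected_los:.2f} days")
print(f"observed/expected ratio = {ratio:.2f}")  # >1: stays longer than the case mix explains
```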
Abstract:
Staphylococcus aureus is a major bovine mastitis pathogen. Although the reported antimicrobial resistance was generally low, the emergence of new genetic clusters in bovine mastitis requires examination of the link between antimicrobial resistance and genotypes. Here, amplified fragment length polymorphism (AFLP) profiles and standard antimicrobial resistance profiles were determined in order to characterize a total of 343 S. aureus cow mastitis isolates from two geographically close regions of Switzerland and France. AFLP profiles revealed similar population compositions in the two regions, with 4 major clusters (C8, C20, C97, and C151), but the proportions of isolates in each cluster significantly diverged between the two countries (P = 9.2 × 10⁻⁹). Antimicrobial resistance was overall low (< 5% resistance to all therapeutically relevant molecules), with the exception of penicillin resistance, which was detected in 26% of the isolates. Penicillin resistance proportions differed between clusters, with only 1 to 2% of resistance associated with C20 and C151 and up to 70% associated with bovine C97. The prevalence of C20 and C8 was unexpectedly high and requires further investigation into the mechanism of adaptation to the bovine host. The strong association of penicillin resistance with few clusters highlights the fact that the knowledge of local epidemiology is essential for rational choices of antimicrobial treatment in the absence of susceptibility testing. Taken together, these observations argue in favor of more routine scrutiny of antimicrobial resistance and antibiotic-resistant clones in cattle and the farm environment.
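One plausible way to test whether cluster proportions differ between the two regions, as reported above (P = 9.2 × 10⁻⁹), is a chi-squared test on the region-by-cluster contingency table; the counts in the sketch below are hypothetical, and the study's actual test is not stated in the abstract.

```python
# Sketch: chi-squared test of cluster proportions between two regions (hypothetical counts).
import numpy as np
from scipy.stats import chi2_contingency

# Rows: regions (Switzerland, France); columns: clusters C8, C20, C97, C151.
counts = np.array([
    [40, 95, 20, 25],   # hypothetical Swiss isolates per cluster
    [55, 30, 60, 18],   # hypothetical French isolates per cluster
])

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")
```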
Abstract:
Work carried out within the framework of the "Case Mix" study conducted by the Institut universitaire de médecine sociale et préventive de Lausanne and the Service de la santé publique et de la planification sanitaire of the canton of Vaud, in collaboration with the cantons of Bern, Fribourg, Geneva, Jura, Neuchâtel, Solothurn, Ticino, and Valais.