23 results for Delay in the tetrad dissociation
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Low-grade gliomas (LGGs) are a group of primary brain tumours usually encountered in young patient populations. These tumours represent a difficult challenge because many patients survive a decade or more and may be at a higher risk for treatment-related complications. Specifically, radiation therapy is known to have a relevant effect on survival, but in many cases it can be deferred to avoid side effects while maintaining its beneficial effect. However, a subset of LGGs manifests more aggressive clinical behaviour and requires earlier intervention. Moreover, the effectiveness of radiotherapy depends on the tumour characteristics. Recently, Pallud et al. (2012, Neuro-Oncology, 14, 1-10) studied patients with LGGs treated with radiation therapy as a first-line therapy and obtained the counterintuitive result that tumours with a fast response to the therapy had a worse prognosis than those responding late. In this paper, we construct a mathematical model describing the basic facts of glioma progression and response to radiotherapy. The model also provides an explanation for the observations of Pallud et al. Using the model, we propose radiation fractionation schemes that might be therapeutically useful by helping to evaluate tumour malignancy while at the same time reducing the toxicity associated with the treatment.
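A minimal sketch of the kind of model such a study might use, assuming logistic tumour growth combined with a linear-quadratic response to each radiation fraction (the abstract does not specify the authors' actual equations, so the forms and symbols below are illustrative):

```latex
% Illustrative growth + radiotherapy model; not the paper's specific formulation.
% N(t): tumour cell number, rho: proliferation rate, K: carrying capacity,
% S(d): surviving fraction after a fraction of dose d (alpha, beta: radiosensitivity).
\begin{align}
  \frac{dN}{dt} &= \rho\, N\!\left(1 - \frac{N}{K}\right), \\
  S(d) &= e^{-\alpha d - \beta d^{2}}, \\
  N(t_i^{+}) &= S(d_i)\, N(t_i^{-}) \quad \text{at each fraction time } t_i .
\end{align}
```

In a model of this type the visible speed of post-treatment shrinkage tracks the proliferation rate ρ, so a fast radiological response can plausibly flag a fast-growing, and hence worse-prognosis, tumour; this is one hedged reading of the Pallud et al. observation.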
Abstract:
During a two-stage revision for prosthetic joint infection (PJI), joint aspirations, open tissue sampling and serum inflammatory markers are performed before re-implantation to exclude ongoing silent infection. We investigated how well these diagnostic procedures predict the risk of recurrence of PJI among asymptomatic patients undergoing a two-stage revision. A total of 62 PJIs were found in 58 patients. All patients had intra-operative surgical exploration during re-implantation, and 48 of them had intra-operative microbiological swabs. Additionally, 18 joint aspirations and one open biopsy were performed before second-stage re-implantation. Recurrence or persistence of PJI occurred in 12 cases, with a mean delay of 218 days after re-implantation, but only four pre- or intra-operative invasive joint samples had grown a pathogen in culture. In at least seven recurrent PJIs (58%), patients had a normal C-reactive protein (CRP, <10 mg/l) level before re-implantation. The sensitivity, specificity, positive predictive value and negative predictive value for the prediction of PJI recurrence were 0.58, 0.88, 0.50 and 0.84 for pre-operative invasive joint aspiration, and 0.17, 0.81, 0.13 and 0.86 for CRP, respectively. In conclusion, pre-operative joint aspiration, intra-operative bacterial sampling, surgical exploration and serum inflammatory markers are poor predictors of PJI recurrence, and the onset of reinfection usually occurs long after re-implantation.
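For reference, all four reported metrics follow directly from a 2×2 confusion matrix; a minimal sketch (the counts are hypothetical, loosely echoing the aspiration figures, not the study's raw data):

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, PPV and NPV from 2x2 confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # recurrences correctly flagged
        "specificity": tn / (tn + fp),  # non-recurrences correctly cleared
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts: 7 of 12 recurrences flagged, 6 false positives among 50 non-recurrences
print(diagnostic_metrics(tp=7, fp=6, fn=5, tn=44))
# {'sensitivity': 0.583..., 'specificity': 0.88, 'ppv': 0.538..., 'npv': 0.897...}
```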
Abstract:
Prospective systematic analyses of the clinical presentation of bullous pemphigoid (BP) are lacking. Little is known about the time required for its diagnosis. Knowledge of the disease spectrum is important for diagnosis, management and inclusion of patients in therapeutic trials.
Abstract:
Despite the introduction of new immunosuppressive agents, a steady decline of functioning renal allografts after living donation is observed. Thus, nonpharmacological strategies to prevent graft loss have to be reconsidered, including donor-specific transfusions (DST). We introduced a cyclosporine-based DST protocol for renal allograft recipients from living-related/unrelated donation. From 1993 to 2003, 200 ml of whole blood, or the corresponding mononuclear cells, from the potential living donor was administered twice to each of our 61 recipient candidates. The transplanted subjects were compared with three groups of patients without DST from the Collaborative Transplant Study (Heidelberg, Germany) over a 6-year period. Six patients became sensitized and were listed without delay for a subsequent cadaveric kidney. DST patients less often received treatment for rejection, and graft survival was superior compared with subjects from the other Swiss transplant centers (n = 513) or from Western Europe (n = 7024). To diminish the probability that these superior results reflect patient selection rather than an effect of DST, a 'matched-pair' analysis controlling for relevant factors of transplant outcome was performed. Again, this analysis indicated that recipients with DST had a better outcome. Thus, our observation suggests that DST improve the outcome of living kidney transplants even when modern immunosuppressive drugs are prescribed.
Abstract:
BACKGROUND/AIMS: We investigated the molecular response to a non-ischemic hypoxic stress in the liver, in particular to assess its hepatoprotective potential. METHODS: The livers of mice were subjected to non-ischemic hypoxia by clamping the hepatic artery (HA) for 2 h while maintaining portal circulation. Hypoxia was defined by a decrease in oxygen saturation, the activation of hypoxia-inducible factor (HIF)-1 and the mRNA up-regulation of responsive genes. To demonstrate that the molecular response to hypoxia may in part be hepatoprotective, pre-conditioned animals were injected with an antibody against Fas (Jo2) to induce acute liver failure. Hepatocyte apoptosis was monitored by caspase-3 activity, cleavage of lamin A and animal survival. RESULTS: Clamping the HA induced a hypoxic stress in the liver in the absence of severe metabolic distress or tissue damage. The hypoxic stimulus was sufficient to activate the HIF-1 signalling pathway and up-regulate hepatoprotective genes. Pre-conditioning the liver with hypoxia delayed the onset of Fas-mediated apoptosis and prolonged animal survival. CONCLUSIONS: Our data reveal that hepatic cells can sense and respond to a decrease in tissue oxygenation and, furthermore, that activation of hypoxia-inducible signalling pathways functions in part to promote liver cell survival.
Abstract:
In the last decade, there has been increasing interest in cognitive alterations during the early course of schizophrenia. From a clinical perspective, a better understanding of cognitive functioning in putative at-risk states for schizophrenia is essential for developing optimal early intervention models. Two approaches have more recently been combined to assess the entire course of the initial schizophrenia prodrome: the predictive "basic symptom at-risk" (BS) and the ultra-high-risk (UHR) criteria. Basic symptoms are considered to be present during the entire disease progression, including the initial prodrome, while the onset of symptoms captured by the UHR criteria expresses further disease progression toward frank psychosis. The present study investigated cognitive functioning in 93 subjects who met either BS or UHR criteria and were thus assumed to be at different points on the putative trajectory to psychosis. We compared them with 43 patients with a first episode of psychosis and with 49 help-seeking patient controls. All groups performed significantly below normative values. Both at-risk groups performed at intermediate levels between the first-episode (FE) group and normative values. The UHR group demonstrated intermediate performance between the FE and BS groups. Overall, auditory working memory, verbal fluency/processing speed, and declarative verbal memory were impaired the most. Our results suggest that cognitive impairments may still be modest in the early stages of the initial schizophrenia prodrome and thus support current efforts to intervene in the early course of impending schizophrenia, because early intervention may prevent or delay the onset of frank psychosis and thus prevent further cognitive damage.
Abstract:
To test the hypothesis that a delay in the diagnosis of paediatric brain tumours results in decreased survival, we compared the prediagnostic period of 315 brain tumour patients (median age 6.7 years; range, 0 to 16 years) with progression-free and overall survival. The median prediagnostic symptomatic interval was 60 days (range, 0 to 3,480 days), with a median parental delay of 14 days (range, 0 to 1,835 days) and a median doctor's delay of 14 days (range, 0 to 3,480 days). The prediagnostic symptomatic interval correlated significantly with patient age, tumour histology, tumour location and year of diagnosis, but not with gender. We then grouped the patients according to histology (low-grade glioma [n=77], medulloblastoma [n=57], high-grade glioma [n=40], craniopharyngioma [n=27], ependymoma [n=20] and germ cell tumours [n=18]). Contrary to common belief, a long prediagnostic symptomatic interval or a long doctor's delay did not result in decreased survival in any of these groups. The effect of tumour biology on survival appears to be dominant, overwhelming any possible opposing effect of a delay in diagnosis.
Abstract:
At the 111th German Medical Assembly in May 2008 in Ulm, Germany, a public debate on the rationing of health care services was started. Since the funds in the German health care system are not sufficient to provide every diagnostic procedure or therapy to every patient under the compulsory medical insurances, many specific health care services have been rationed in recent years and are no longer covered by the regular medical insurance, such as PSA measurements in urology or IOP measurements in ophthalmology. In contrast to the health care systems in Scandinavia, where the rationing of health care services is publicly documented by the government, no comparable public statements exist in Germany. It is therefore left to physicians to explain to their patients the "hidden" rationing of public health care services, which has also led to an increase in individual health care services (IGeL in Germany) to be paid for privately by the patient. It is undoubtedly true that not all medically possible services need to be paid for by the health insurance; however, an official determination of these "out of pocket" health care services is necessary. The aim of this work was therefore to develop possible "stop" criteria, following the Scandinavian system, for common eye diseases and their consecutive therapies that need not be paid for, or should be paid for only after a delay, by the health insurances.
Abstract:
BACKGROUND: Alcohol consumption leading to morbidity and mortality also affects HIV-infected individuals. Here, we aimed to study self-reported alcohol consumption and to determine its association with adherence to antiretroviral therapy (ART) and HIV surrogate markers. METHODS: Cross-sectional data on daily alcohol consumption from August 2005 to August 2007 were analysed and categorized according to the World Health Organization definition (light, moderate or severe health risk). Multivariate logistic regression models and Pearson's χ² statistics were used to test the influence of alcohol use on endpoints. RESULTS: Of 6,323 individuals, 52.3% had consumed alcohol less than once a week in the past 6 months. Alcohol intake was deemed light in 39.9%, moderate in 5.0% and severe in 2.8%. Higher alcohol consumption was significantly associated with older age, less education, injection drug use, being in a drug maintenance programme, psychiatric treatment, hepatitis C virus coinfection and a longer time since diagnosis of HIV. Lower alcohol consumption was found in males, non-Caucasians, individuals currently on ART and those with more ART experience. In patients on ART (n=4,519), missed doses and alcohol consumption were positively correlated (P<0.001). ART-pretreated severe alcohol consumers were more often off treatment despite having CD4+ T-cell counts <200 cells/μl; however, severe alcohol consumption per se did not delay starting ART. In treated individuals, alcohol consumption was not associated with worse HIV surrogate markers. CONCLUSIONS: Higher alcohol consumption in HIV-infected individuals was associated with several psychosocial and demographic factors, non-adherence to ART and, in pretreated individuals, being off treatment despite low CD4+ T-cell counts.
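As a point of orientation, a Pearson χ² test of this kind reduces to a contingency table; a minimal sketch with made-up numbers (the table is illustrative, not the cohort's data):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: ART adherence (rows) vs. alcohol-risk category (columns)
table = np.array([
    [820, 95],   # no missed doses: light risk, severe risk
    [640, 130],  # missed doses:    light risk, severe risk
])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4g}")
```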
Abstract:
OBJECTIVES: The impact of diagnostic delay (the period from the appearance of first symptoms to diagnosis) on the clinical course of Crohn's disease (CD) is unknown. We examined whether the length of diagnostic delay affects disease outcomes. METHODS: Data from the Swiss IBD cohort study were analyzed. Patients were recruited from university centers (68%), regional hospitals (14%), and private practices (18%). The frequencies of bowel stenoses, internal fistulas, perianal fistulas, and CD-related surgery (intestinal and perianal) were analyzed. RESULTS: A total of 905 CD patients (53.4% female; median age at diagnosis, 26 years (IQR 20-36)) were stratified into four groups according to the quartiles of diagnostic delay (0-3, 4-9, 10-24, and ≥25 months, respectively). Median diagnostic delay was 9 months (IQR 3-24). The frequency of immunomodulator and/or anti-tumor necrosis factor drug use did not differ among the four groups. The length of diagnostic delay was positively correlated with the occurrence of bowel stenosis (odds ratio (OR) 1.76, P=0.011 for a delay of ≥25 months) and intestinal surgery (OR 1.76, P=0.014 for a delay of 10-24 months and OR 2.03, P=0.003 for a delay of ≥25 months). Disease duration was positively associated and non-ileal disease location negatively associated with bowel stenosis (OR 1.07, P<0.001, and OR 0.41, P=0.005, respectively) and intestinal surgery (OR 1.14, P<0.001, and OR 0.23, P<0.001, respectively). CONCLUSIONS: The length of diagnostic delay is correlated with an increased risk of bowel stenosis and CD-related intestinal surgery. Efforts should be undertaken to shorten the diagnostic delay.
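Odds ratios stratified by delay quartile, as reported above, typically come from a logistic regression on quartile indicators; a minimal sketch on simulated data (all variable names, effect sizes and seeds are illustrative, not the cohort's):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 905
delay = rng.exponential(scale=15, size=n)                      # diagnostic delay, months
quartile = pd.qcut(delay, 4, labels=["Q1", "Q2", "Q3", "Q4"])  # delay quartiles

# Simulated stenosis risk increasing with delay (illustrative effect only)
p_stenosis = 1 / (1 + np.exp(-(-1.5 + 0.02 * delay)))
stenosis = rng.binomial(1, p_stenosis)

X = sm.add_constant(pd.get_dummies(quartile, drop_first=True).astype(float))
fit = sm.Logit(stenosis, X).fit(disp=0)
print(np.exp(fit.params))  # odds ratios relative to the shortest-delay quartile Q1
```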
Abstract:
BACKGROUND & AIMS: Development of strictures is a major concern for patients with eosinophilic esophagitis (EoE). At diagnosis, EoE can present with an inflammatory phenotype (characterized by whitish exudates, furrows, and edema), a stricturing phenotype (characterized by rings and stenosis), or a combination of these. Little is known about the progression of stricture formation; we evaluated stricture development over time in the absence of treatment and investigated risk factors for stricture formation. METHODS: We performed a retrospective study using the Swiss EoE Database, collecting data on 200 patients with symptomatic EoE (153 men; mean age at diagnosis, 39 ± 15 years). Stricture severity was graded based on the degree of difficulty associated with passing the standard adult endoscope. RESULTS: The median delay in diagnosis of EoE was 6 years (interquartile range, 2-12 years). With increasing duration of diagnostic delay, the prevalence of fibrotic features of EoE, based on endoscopy, increased from 46.5% (diagnostic delay, 0-2 years) to 87.5% (diagnostic delay, >20 years; P = .020). Similarly, the prevalence of esophageal strictures increased with the duration of diagnostic delay, from 17.2% (diagnostic delay, 0-2 years) to 70.8% (diagnostic delay, >20 years; P < .001). Diagnostic delay was the only risk factor for strictures at the time of EoE diagnosis (odds ratio, 1.08; 95% confidence interval, 1.040-1.122; P < .001). CONCLUSIONS: The prevalence of esophageal strictures correlates with the duration of untreated disease. These findings indicate the need to minimize the delay in diagnosis of EoE.
Abstract:
'Sensing the self' relies on the ability to distinguish self-generated from external stimuli. It requires functioning mechanisms to establish feelings of agency and ownership. Agency is defined causally, where the subject's action is followed by an effect. Ownership is defined by the features of the effect, independent of the action. In our study, we manipulated these qualities separately. Thirteen right-handed healthy individuals performed the experiment while 76-channel EEG was recorded. Stimuli consisted of visually presented words, read aloud by the subject. The experiment consisted of six conditions: (a) subjects saw a word, read it aloud and heard it in their own voice; (b) like (a), but the word was heard in an unfamiliar voice; (c) subjects heard a word in their own voice without speaking; (d) like (c), but the word was heard in an unfamiliar voice; (e) like (a), but subjects heard the word with a delay; (f) subjects read without hearing. ERPs and difference maps were computed for all conditions. Effects were analysed topographically. The N100 (86-172 ms) displayed significant main effects of agency and ownership. The topographies of the two effects shared little common variance, suggesting independent effects. Later effects (174-400 ms) of agency and ownership were topographically similar, suggesting common mechanisms. Replicating earlier studies, significant N100 suppression was observed, with a topography resembling the agency effect. 'Sensing the self' thus appears to recruit at least two distinct processes: an agency assessment that represents causality and an ownership assessment that compares stimulus features with memory content.
Abstract:
Adding to the ongoing debate regarding vegetation recolonisation in Europe (more particularly its timing) and climate change since the Lateglacial, this study investigates a long sediment core (LL081) from Lake Ledro (652 m a.s.l., southern Alps, Italy). Environmental changes during the Lateglacial and early-middle Holocene were reconstructed using multiproxy analyses (pollen-based vegetation and climate reconstructions, lake levels, magnetic susceptibility and X-ray fluorescence (XRF) measurements) that record climate and land-use changes. The well-dated and high-resolution pollen record of Lake Ledro is compared with vegetation records from the southern and northern Alps to trace the history of tree species distribution. An altitude-dependent progressive time delay of the first continuous occurrence of Abies (fir) and of the Larix (larch) development has been observed since the Lateglacial in the southern Alps. This pattern suggests that the mid-altitude Lake Ledro area was not a refuge and that trees originated from lowlands or hilly areas (e.g. the Euganean Hills) in northern Italy. Preboreal oscillations (ca. 11,000 cal BP), Boreal oscillations (ca. 10,200 and 9,300 cal BP) and the 8.2 kyr cold event suggest a centennial-scale climate forcing in the studied area. Picea (spruce) expansion occurred preferentially around 10,200 and 8,200 cal BP in the south-eastern Alps and therefore reflects the long-lasting cumulative effects of the successive Boreal oscillations and the 8.2 kyr cold event. The extension of Abies is contemporaneous with the 8.2 kyr event, but its development in the southern Alps benefited from the wettest interval, 8,200-7,300 cal BP, evidenced by high lake levels, flood activity and pollen-based climate reconstructions. Since ca. 7,500 cal BP, a weak signal of anthropogenic indicators in the pollen record suggests limited human impact. The period between ca. 5,700 and ca. 4,100 cal BP is considered a transition to colder and wetter conditions (particularly during summers) that favoured the development of a dense beech (Fagus) forest, which in turn caused a distinctive yew (Taxus) decline. We conclude that climate was the dominant factor controlling vegetation changes and erosion processes during the early and middle Holocene (up to ca. 4,100 cal BP).
Abstract:
Recent downward revisions in the climate response to rising CO2 levels, and opportunities for reducing non-CO2 climate warming, have both been cited as evidence that the case for reducing CO2 emissions is less urgent than previously thought. Evaluating the impact of delay is complicated by the fact that CO2 emissions accumulate over time, so what happens after they peak is as relevant for long-term warming as the size and timing of the peak itself. Previous discussions have focused on how the rate of reduction required to meet any given temperature target rises asymptotically the later the emissions peak. Here we focus on a complementary question: how fast is peak CO2-induced warming increasing while mitigation is delayed, assuming no increase in rates of reduction after the emissions peak? We show that this peak-committed warming is increasing at the same rate as cumulative CO2 emissions, about 2% per year, much faster than observed warming, independent of the climate response.
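The arithmetic behind that claim can be sketched under the standard assumption that peak warming scales linearly with total cumulative emissions (a TCRE-type relation); the symbols and the exponential-growth idealization below are ours, not the paper's:

```latex
% Assume emissions grow as E(t) = E_0 e^{gt} until mitigation starts at time t,
% then fall at a fixed fractional rate r; peak warming scales with total cumulative
% emissions via the TCRE coefficient kappa. (Idealized sketch, not the paper's derivation.)
\begin{align}
  C(t) &= \int_{-\infty}^{t} E(s)\,ds = \frac{E(t)}{g}, \qquad
  C_{\mathrm{future}} \approx \int_{t}^{\infty} E(t)\,e^{-r(s-t)}\,ds = \frac{E(t)}{r}, \\
  T_{\mathrm{peak}}(t) &\approx \kappa\,E(t)\left(\frac{1}{g} + \frac{1}{r}\right)
  \;\Longrightarrow\;
  \frac{\dot{T}_{\mathrm{peak}}}{T_{\mathrm{peak}}} = g \approx 2\%\ \mathrm{yr}^{-1}.
\end{align}
```

On this reading, the growth rate of peak-committed warming is set entirely by the emissions growth rate g and is independent of the climate-response coefficient κ, matching the abstract's point.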
Abstract:
While keto-amino cytosine is the dominant species in aqueous solution, spectroscopic studies in molecular beams and in noble gas matrices show that other cytosine tautomers prevail in apolar environments. Each of these offers two or three H-bonding sites (Watson–Crick, wobble, sugar-edge). The mass- and isomer-specific S1 ← S0 vibronic spectra of cytosine·2-pyridone (Cyt·2PY) and 1-methylcytosine·2PY are measured using UV laser resonant two-photon ionization (R2PI), UV/UV depletion, and IR depletion spectroscopy. The UV spectra of the Watson–Crick and sugar-edge isomers of Cyt·2PY are separated using UV/UV spectral hole-burning. Five different isomers of Cyt·2PY are observed in a supersonic beam. We show that the Watson–Crick and sugar-edge dimers of keto-amino cytosine with 2PY are the most abundant in the beam, although keto-amino cytosine is only the third most abundant tautomer in the gas phase. We identify the different isomers by combining three diagnostic tools: (1) methylation of the cytosine N1–H group prevents formation of both the sugar-edge and wobble isomers and yields the Watson–Crick isomer exclusively. (2) The calculated ground-state binding and dissociation energies, relative gas-phase abundances, and excitation and ionization energies are in agreement with the assignment of the dominant Cyt·2PY isomers to the Watson–Crick and sugar-edge complexes of keto-amino cytosine. (3) The comparison of calculated ground-state vibrational frequencies with the experimental IR spectra in the carbonyl stretch and NH/OH/CH stretch ranges strengthens this identification.