26 results for time to contact
Abstract:
About 85% of multiple sclerosis (MS) cases start as a clinically isolated syndrome (CIS). When patients present with a CIS, clinicians face many questions, most of them related to prognosis and treatment. Consequently, patients with CIS have been a focus of research. Several studies have demonstrated a relationship between a positive IgM lipid-specific oligoclonal band pattern in CSF and higher lesion load on brain magnetic resonance imaging (MRI), a higher number of relapses and greater disability, even at the first stages of the disease. On the other hand, no studies have used this evidence to support more aggressive disease-modifying therapy in the initial stages of the disease course to prevent early axonal damage. The aim of this study is to assess the most effective approved treatment for MS against the current therapy for CIS in patients at high risk of developing clinically definite multiple sclerosis (CDMS) and with biomarkers of poor prognosis. In this group of patients, any disease activity will eventually lead to disability; therefore, the earlier treatment is initiated, the more effective it will be at preventing disability. It is considered that “time lost is brain lost”, since once damage is established, there is no therapy that can recover what has been lost. In this phase III clinical trial, 172 patients will be randomized 1:1 to receive interferon β-1b or natalizumab over 96 weeks. Time to develop CDMS will be the primary endpoint. Secondary endpoints will include clinical data, MRI measurements and quality-of-life tests.
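The abstract names a time-to-event primary endpoint but not an analysis plan. Purely as a hedged illustration, a Kaplan-Meier / log-rank comparison of time to CDMS between the two arms could look like the sketch below (Python with the lifelines package; the DataFrame and its column names are hypothetical, not the trial's actual variables).

# Illustrative sketch only: Kaplan-Meier curves and a log-rank test for the
# time-to-CDMS endpoint, assuming a hypothetical DataFrame with columns
# 'arm' ('IFN-b1b' or 'natalizumab'), 'weeks_to_cdms' and
# 'converted' (1 = developed CDMS, 0 = censored at week 96).
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

def compare_arms(df: pd.DataFrame):
    ifn = df[df["arm"] == "IFN-b1b"]
    nat = df[df["arm"] == "natalizumab"]

    # Fit one survival curve per treatment arm.
    km_ifn, km_nat = KaplanMeierFitter(), KaplanMeierFitter()
    km_ifn.fit(ifn["weeks_to_cdms"], ifn["converted"], label="IFN-b1b")
    km_nat.fit(nat["weeks_to_cdms"], nat["converted"], label="natalizumab")

    # Log-rank test for a difference in time to CDMS between the arms.
    result = logrank_test(
        ifn["weeks_to_cdms"], nat["weeks_to_cdms"],
        event_observed_A=ifn["converted"], event_observed_B=nat["converted"],
    )
    return km_ifn, km_nat, result.p_value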
Abstract:
When a rubber hand is placed on a table top in a plausible position as if part of a person's body, and is stroked synchronously with the person's corresponding hidden real hand, an illusion of ownership over the rubber hand can occur (Botvinick and Cohen 1998). A similar result has been found with respect to a virtual hand portrayed in a virtual environment, a virtual hand illusion (Slater et al. 2008). The conditions under which these illusions occur have been the subject of considerable study. Here we exploited the flexibility of virtual reality to examine four contributory factors: visuo-tactile synchrony while stroking the virtual and the real arms, body continuity, alignment between the real and virtual arms, and the distance between them. We carried out three experiments on a total of 32 participants where these factors were varied. The results show that the subjective illusion of ownership over the virtual arm and the time to evoke this illusion are highly dependent on synchronous visuo-tactile stimulation and on connectivity of the virtual arm with the rest of the virtual body. The alignment between the real and virtual arms and the distance between these were less important. It was found that proprioceptive drift was not a sensitive measure of the illusion, but was only related to the distance between the real and virtual arms.
Abstract:
The present study was designed to analyse the effect of the length of exposure to a long photoperiod imposed c. 3 weeks after sowing in spring wheat (cv. UQ189) and barley (cv. Arapiles) to (i) establish whether the response to the number of cycles of exposure is quantitative or qualitative, (ii) determine the existence of a commitment to particular stages well before the stage has been observable, and (iii) study the interrelationships between the effects on final leaf number and phyllochron when the stimulus is provided several days after seedling emergence. Both wheat and barley seemed to respond quantitatively to the number of long-day cycles they were exposed to. However, wheat required approximately 4 long-day cycles to produce a significant response in time to heading, while the barley cultivar used in the study was responsive to the minimum length of exposure. The response to extended photoperiod cycles during the stem elongation phase was due to ‘memory’ photoperiod effects, related in the case of wheat to the fact that the pre-terminal spikelet appearance phase saturated its photoperiod response well before that stage was reached. Therefore, the commitment to terminal spikelet appearance in wheat may be reached well before this stage can be recognized. As the response in duration to heading exceeded that of the final leaf number, and the stem elongation phase responded to memory effects of photoperiod, the phyllochron of both cereals was responsive to the treatments, the average phyllochron being shortened (leaf appearance accelerated) by exposure to longer periods of long days. The response in average phyllochron was due to a switch from bi-linear to linear models of leaf number v. time as conditions became increasingly inductive, with the phyllochron of the initial (6–8) leaves being similar for all treatments (within each species) and increasing thereafter.
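Purely as an illustration of the bi-linear v. linear comparison of leaf number against time mentioned above, the sketch below fits both model forms to synthetic data and compares residual sums of squares (Python; the data, breakpoint and parameter values are invented and not taken from the study).

# Illustrative sketch only: comparing linear and bi-linear ("broken-stick")
# fits of leaf number against thermal time. All data and parameters are
# invented; this is not the analysis used in the study.
import numpy as np
from scipy.optimize import curve_fit

def linear(x, a, b):
    return a + b * x

def bilinear(x, a, b1, b2, xb):
    # Slope b1 up to the breakpoint xb, slope b2 afterwards.
    return np.where(x < xb, a + b1 * x, a + b1 * xb + b2 * (x - xb))

rng = np.random.default_rng(0)
x = np.linspace(0, 800, 25)                  # thermal time (degree days)
y = bilinear(x, 1.0, 0.012, 0.007, 450) + rng.normal(0, 0.15, x.size)

p_lin, _ = curve_fit(linear, x, y)
p_bil, _ = curve_fit(bilinear, x, y, p0=[1.0, 0.01, 0.01, 400])

def rss(model, params):
    # Residual sum of squares for a fitted model.
    return float(np.sum((y - model(x, *params)) ** 2))

print("linear RSS:", rss(linear, p_lin))
print("bi-linear RSS:", rss(bilinear, p_bil))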
Abstract:
Differences amongst wheat cultivars in the rate of reproductive development are largely dependent on differences in their sensitivity to photoperiod and vernalization. However, when these responses are accounted for, by growing vernalized seedlings under long photoperiods, cultivars can still differ markedly in time to ear emergence. Control of the rate of development by this ‘third factor’ has been poorly understood and is variously referred to as intrinsic earliness, earliness in the narrow sense, basic vegetative period, earliness per se, and basic development rate. Certain assumptions are made in the concept of intrinsic earliness: that differences in intrinsic earliness (i) are independent of the responses of the cultivars to photoperiod and vernalization, (ii) apply only to the length of the vegetative period up to floral initiation (as suggested by several authors), and (iii) are maintained under different temperatures, measured either in days or degree days. As a consequence, the ranking of cultivars (from intrinsically early to intrinsically late) must be maintained at different temperatures. This paper examines, through re-analysis of published data, the extent to which these assumptions can be supported. Although it is shown that intrinsic earliness operates independently of photoperiod and vernalization responses, the other assumptions were not supported. The differences amongst genotypes in time to ear emergence, when grown under above-optimum vernalization and photoperiod (that is, when the response to these factors is saturated), were not exclusively due to parallel differences in the length of the vegetative phase, and the length of the reproductive phase was independent of that of the vegetative phase. Thus, it would be possible to change the relative allocation of time to the vegetative and reproductive periods with no change in the full period to ear emergence. The differences in intrinsic earliness between cultivars were modified by the temperature regime under which they were grown, i.e. the difference between cultivars (considering either the full phase to ear emergence or its sub-phases) was not a constant amount of time or thermal time at different temperatures. In addition, in some instances genotypes changed their ranking for ‘intrinsic earliness’ depending on the temperature regime. This was interpreted to mean that while all genotypes are sensitive to temperature, they differ amongst themselves in the extent of that sensitivity. Therefore, ‘intrinsic earliness’ should not be considered a static genotypic characteristic, but rather the result of the interaction between genotype and temperature. Intrinsic earliness is therefore likely to be related to temperature sensitivity. Some implications of these conclusions for plant breeding and crop simulation modelling are discussed.
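Since the comparison above is made both in days and in thermal time, here is a minimal, hedged sketch of the degree-day calculation implied by "thermal time" (Python; the base temperature and the daily records are hypothetical examples, not values from the paper).

# Minimal sketch of thermal time (degree days): the sum of daily mean
# temperatures above a base temperature. The base temperature and the
# example records are hypothetical.
from typing import Iterable

def thermal_time(t_max: Iterable[float], t_min: Iterable[float],
                 t_base: float = 0.0) -> float:
    """Accumulated degree days over the supplied daily records."""
    return sum(max(0.0, (hi + lo) / 2.0 - t_base)
               for hi, lo in zip(t_max, t_min))

# Example: ten days of daily maximum/minimum temperatures (degrees C).
tt = thermal_time([18, 20, 22, 19, 17, 21, 23, 24, 20, 18],
                  [6, 8, 9, 7, 5, 10, 11, 12, 9, 7])
print(f"{tt:.1f} degree days accumulated")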
Abstract:
In this work, we present an integral scheduling system for non-dedicated clusters, termed CISNE-P, which ensures the performance required by the local applications while simultaneously allocating cluster resources to parallel jobs. Our approach solves the problem efficiently by using a social contract technique, based on reserving computational resources to preserve a predetermined response time for local users. CISNE-P is a middleware that includes both a previously developed space-sharing job scheduler and a dynamic coscheduling system, a time-sharing scheduling component. The experimentation performed on a Linux cluster shows that these two scheduler components are complementary and that good coordination between them improves global performance significantly. We also compare two different CISNE-P implementations: one developed inside the kernel and the other implemented entirely in user space.
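As a hedged, simplified sketch of the "social contract" idea described above (not the CISNE-P code itself): reserve a fixed CPU share per node for local users and admit parallel jobs only while that reservation is preserved. The node names, the reservation fraction and the job demands below are invented.

# Illustrative sketch only: admit parallel work on a node only if the CPU
# share reserved for local users remains free, so local response time is
# preserved. The 25% reservation and all demands are hypothetical.
from dataclasses import dataclass, field

LOCAL_RESERVATION = 0.25   # CPU fraction promised to local users per node

@dataclass
class Node:
    name: str
    parallel_load: float = 0.0          # CPU fraction used by parallel jobs
    jobs: list = field(default_factory=list)

    def can_admit(self, demand: float) -> bool:
        # Keep the reserved share free for local applications.
        return self.parallel_load + demand <= 1.0 - LOCAL_RESERVATION

    def admit(self, job_id: str, demand: float) -> bool:
        if self.can_admit(demand):
            self.parallel_load += demand
            self.jobs.append(job_id)
            return True
        return False

cluster = [Node("n1"), Node("n2")]
for i, demand in enumerate([0.5, 0.3, 0.4]):
    placed = any(node.admit(f"job{i}", demand) for node in cluster)
    print(f"job{i} (demand {demand:.1f}) placed:", placed)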
Abstract:
Everyday tasks seldom involve isolated actions but rather sequences of them. We can see whether previous actions influence the current one by examining the response time to controlled sequences of stimuli. Specifically, depending on the response-stimulus temporal interval (RSI), different mechanisms have been proposed to explain sequential effects in two-choice serial response tasks. Whereas an automatic facilitation mechanism is thought to produce a benefit for response repetitions at short RSIs, subjective expectancies are considered to replace the automatic facilitation at longer RSIs, producing a cost-benefit pattern: repetitions are faster after other repetitions but slower after alternations. However, there is no direct evidence showing the impact of subjective expectancies on sequential effects. Using a fixed sequence, the results of the reported experiment showed that the repetition effect was enhanced in participants who acquired complete knowledge of the order. Nevertheless, a similar cost-benefit pattern was observed in all participants and in all learning blocks. The results therefore suggest that sequential effects, including the cost-benefit pattern, are the consequence of automatic mechanisms that operate independently of (and simultaneously with) explicit knowledge of the sequence or other subjective expectancies.
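As an illustrative aside (not the authors' analysis code), the cost-benefit pattern described above can be computed from trial-level data roughly as follows; the DataFrame layout, column names and example values are hypothetical.

# Illustrative sketch: mean response time by first-order transition
# (repetition vs. alternation) and by the preceding transition, i.e. the
# cost-benefit pattern. Column names and example data are hypothetical.
import pandas as pd

def cost_benefit(df: pd.DataFrame) -> pd.Series:
    df = df.copy()
    # Repetition (R) if the stimulus equals the previous one, else alternation (A).
    same = df["stimulus"] == df["stimulus"].shift(1)
    df["trans"] = same.map({True: "R", False: "A"})
    df.loc[df.index[0], "trans"] = None        # no transition on the first trial
    df["prev_trans"] = df["trans"].shift(1)
    # Mean RT for each combination: repetitions after repetitions (RR) should
    # be fastest, repetitions after alternations (AR) relatively slow.
    return df.dropna().groupby(["prev_trans", "trans"])["rt"].mean()

trials = pd.DataFrame({
    "stimulus": ["X", "X", "Y", "Y", "X", "X", "X", "Y", "X", "Y"],
    "rt":       [420, 380, 450, 400, 430, 390, 370, 440, 410, 455],
})
print(cost_benefit(trials))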
Abstract:
We previously reported that A. hydrophila GalU mutants were still able to produce UDP-glucose, introduced as a glucose residue in their lipopolysaccharide core. In this study, we traced the unique origin of this UDP-glucose to a branched α-glucan surface polysaccharide. This glucan, attached to the surface through the O-antigen ligase (WaaL), is common to the mesophilic Aeromonas strains tested. The Aeromonas glucan is produced by the action of the glycogen synthase (GlgA) and the UDP-Glc pyrophosphorylase (GlgC), the latter wrongly annotated as an ADP-Glc pyrophosphorylase in the Aeromonas genomes available. The Aeromonas glycogen synthase is able to react with UDP- or ADP-glucose, which is not the case for the E. coli glycogen synthase, which reacts only with ADP-glucose. The Aeromonas surface glucan plays a role in enhancing biofilm formation. Finally, for the first time to our knowledge, a clear preference in terms of bacterial survival and pathogenesis is observed in the choice between the surface saccharides produced (lipopolysaccharide core or glucan).
Abstract:
This study evaluated the performance of the Tuberculin Skin Test (TST) and QuantiFERON-TB Gold In-Tube (QFT) and the possible association of factors which may modify their results in young children (0-6 years) with recent contact with an index tuberculosis case. Materials and Methods: A cross-sectional study including 135 children was conducted in Manaus, Amazonas, Brazil. The TST and QFT were performed and the test results were analyzed in relation to the personal characteristics of the children studied and their relationship with the index case. Results: The rates of positivity were 34.8% (TST) and 26.7% (QFT), with 14.1% indeterminate QFT results. Concordance between the tests was fair (Kappa = 0.35, P<0.001). Both the TST and QFT were associated with the intensity of exposure (Linear OR = 1.286, P = 0.005; Linear OR = 1.161, P = 0.035, respectively), with only the TST being associated with the time of exposure (Linear OR = 1.149, P = 0.009). The presence of intestinal helminths in the TST+ group was associated with negative QFT results (OR = 0.064, P = 0.049). In the TST- group, lower levels of ferritin were associated with QFT+ results (Linear OR = 0.956, P = 0.036). Conclusions: Concordance between the TST and QFT was lower than expected. The factors associated with discordant results were intestinal helminths, ferritin levels and exposure time to the index tuberculosis case. In the TST+ group, helminths were associated with negative QFT results, suggesting impaired cell-mediated immunity. The TST-&QFT+ group had a shorter exposure time and lower ferritin levels, suggesting that QFT is faster and that ferritin may be a potential biomarker of early stages of tuberculosis infection.
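As a hedged sketch of the kind of logistic model behind the exposure odds ratios reported above (not the study's actual code): the column names and data layout below are hypothetical and may differ from the study's variables.

# Illustrative sketch only: logistic regression of QFT positivity on the
# intensity and duration of exposure to the index case. Column names and
# the data layout are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def fit_positivity_model(df: pd.DataFrame):
    # Predictors: an exposure intensity score and hours of contact per week.
    X = sm.add_constant(df[["exposure_intensity", "exposure_hours"]])
    model = sm.Logit(df["qft_positive"], X).fit(disp=False)
    odds_ratios = np.exp(model.params)   # per-unit odds ratios
    return model, odds_ratios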
Abstract:
BACKGROUND: With many atypical antipsychotics now available on the market, it has become common clinical practice to switch between atypical agents as a means of achieving the best clinical outcomes. This study aimed to examine the impact of switching from olanzapine to risperidone and vice versa on clinical status and tolerability outcomes in outpatients with schizophrenia in a naturalistic setting. METHODS: W-SOHO was a 3-year observational study that involved over 17,000 outpatients with schizophrenia from 37 countries worldwide. The present post hoc study focused on the subgroup of patients who started taking olanzapine at baseline and subsequently made their first switch to risperidone (n=162) and vice versa (n=136). Clinical status was assessed at the visit when the first switch was made (i.e. before switching) and after switching. Logistic regression models examined the impact of medication switch on tolerability outcomes, and linear regression models assessed the association between medication switch and change in the Clinical Global Impression-Schizophrenia (CGI-SCH) overall score or change in weight. In addition, Kaplan-Meier survival curves and Cox proportional hazards models were used to analyze the time to medication switch as well as the time to relapse (symptom worsening as assessed by the CGI-SCH scale, or hospitalization). RESULTS: 48% and 39% of patients switching to olanzapine and risperidone, respectively, remained on the medication without further switches (p=0.019). Patients switching to olanzapine were significantly less likely to experience relapse (hazard ratio: 3.43, 95% CI: 1.43, 8.26), extrapyramidal symptoms (odds ratio [OR]: 4.02, 95% CI: 1.49, 10.89) and amenorrhea/galactorrhea (OR: 8.99, 95% CI: 2.30, 35.13). No significant difference in weight change was, however, found between the two groups. While the CGI-SCH overall score improved in both groups after switching, there was a significantly greater change in those who switched to olanzapine (difference of 0.29 points, p=0.013). CONCLUSION: Our study showed that patients who switched from risperidone to olanzapine were likely to experience a more favorable treatment course than those who switched from olanzapine to risperidone. Given the nature of the observational study design and the small sample size, additional studies are warranted.
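The relapse analysis above uses Kaplan-Meier curves and Cox proportional hazards models; a minimal, hedged sketch of such a model follows (Python with the lifelines package; the DataFrame columns are hypothetical and are not the W-SOHO variable names).

# Illustrative sketch only: a Cox proportional hazards model for time to
# relapse with switch direction as the covariate. The column names
# ('weeks_to_relapse', 'relapsed', 'switched_to_risperidone') are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

def time_to_relapse_model(df: pd.DataFrame) -> CoxPHFitter:
    cph = CoxPHFitter()
    cph.fit(
        df[["weeks_to_relapse", "relapsed", "switched_to_risperidone"]],
        duration_col="weeks_to_relapse",
        event_col="relapsed",
    )
    # exp(coef) for 'switched_to_risperidone' is the hazard ratio for relapse
    # relative to switching in the other direction.
    return cph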