976 results for Trail Making Test
Abstract:
Making research relevant to development is a complex, non-linear and often unpredictable process that requires very particular skills and strategies on the part of researchers. The National Centre of Competence in Research (NCCR) North-South provides financial and technical support for researchers so that they can effectively cooperate with policy-makers and practitioners. An analysis of 10 years of experience translating research into development practice in the NCCR North-South revealed the following four strategies as particularly relevant: a) research orientation towards the needs and interests of partners; b) implementation of promising methods and approaches; c) communication and dissemination of research results; and d) careful analysis of the political context through monitoring and learning approaches. The NCCR North-South experience shows that “doing excellent research” is just one piece of the mosaic. It is equally important to join hands with non-academic partners from the very beginning of a research project, in order to develop and test new pathways for sustainable development. Capacity building – in the North and South – enables researchers to do both: to do excellent research and to make it relevant for development.
Abstract:
Previous work has reported that in the Iowa gambling task (IGT) advantageous decisions may be taken before the advantageous strategy is known [Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275, 1293-1295]. In order to test whether explicit memory is essential for the acquisition of a behavioural preference for advantageous choices, we measured behavioural performance and skin conductance responses (SCRs) in five patients with dense amnesia following damage to the basal forebrain and orbitofrontal cortex, six amnesic patients with damage to the medial temporal lobe or the diencephalon, and eight control subjects performing the IGT. Across 100 trials, healthy participants acquired a preference for advantageous choices and generated large SCRs to high levels of punishment. In addition, their anticipatory SCRs to disadvantageous choices were larger than to advantageous choices. However, this dissociation occurred much later than the behavioural preference for advantageous alternatives. In contrast, though exhibiting discriminatory autonomic SCRs to different levels of punishment, 9 of 11 amnesic patients performed at chance and did not show differential anticipatory SCRs to advantageous and disadvantageous choices. Further, the magnitude of anticipatory SCRs did not correlate with behavioural performance. These results suggest that the acquisition of a behavioural preference--be it for advantageous or disadvantageous choices--depends on the memory of previous reinforcements encountered in the task, a capacity requiring intact explicit memory.
Abstract:
BACKGROUND: Mast cell activation through FcepsilonRI cross-linking has a pivotal role in the initiation of allergic reactions. The influence of this activation on programmed cell death of human mast cells has not yet been clarified. This study evaluates the influence of IgE-dependent activation, alone and in synergy with TRAIL, on the expression of molecules involved in apoptotic signal transduction. METHODS: Human cord blood derived mast cells (CBMC) were cultured with myeloma IgE followed by activation with anti-human IgE. The expression of proteins involved in apoptotic signal transduction was assessed by immunoblot analysis. To test the effect of activation on a pro-apoptotic stimulus, activated, IgE-treated and resting CBMC were incubated with TRAIL, or in a medium with suboptimal concentrations of stem cell factor (SCF). RESULTS: Consistent with our previous study, IgE-dependent activation increased TRAIL-induced caspase-8 and caspase-3 cleavage. However, it did not have a significant influence on CBMC death induced by SCF withdrawal. IgE-dependent activation increased the expression of the anti-apoptotic molecules FLIP and myeloid cell leukemia 1 (MCL-1) as well as the pro-apoptotic molecule BIM. In addition, a decrease in BID expression was observed. TRAIL could reverse the increase in FLIP but did not influence the upregulation of MCL-1 and of BIM. CONCLUSIONS: These findings suggest that IgE-dependent activation of human mast cells induces an increase in both pro-survival and pro-apoptotic molecules. We therefore hypothesized that IgE-dependent activation may regulate human mast cell apoptosis by fine-tuning anti-apoptotic and pro-apoptotic factors.
Abstract:
During a project, managers encounter numerous contingencies and face the challenging task of making decisions that will effectively keep the project on track. This task is particularly challenging because construction projects are non-prototypical and their processes are irreversible. It is therefore critical to apply a methodological approach to develop a few alternative management decision strategies during the planning phase, which can be deployed to manage alternative scenarios resulting from expected and unexpected disruptions in the as-planned schedule. Such a methodology should have the following features, which are missing in the existing research: (1) considering the effects of local decisions on global project outcomes; (2) studying how a schedule responds to decisions and disruptive events, because the risk in a schedule is a function of the decisions made; (3) establishing a method to assess and improve management decision strategies; and (4) developing project-specific decision strategies, because each construction project is unique and the lessons from a particular project cannot easily be applied to projects with different contexts. The objective of this dissertation is to develop a schedule-based simulation framework to design, assess, and improve sequences of decisions for the execution stage. The contribution of this research is the introduction of decision strategies for managing a project and the establishment of an iterative methodology to continuously assess and improve decision strategies and schedules. Project managers or schedulers can implement the methodology to develop and identify schedules, accompanied by suitable decision strategies, for managing a project at the planning stage. The developed methodology also lays the foundation for an algorithm that continuously and automatically generates satisfactory schedules and strategies throughout the construction life of a project. Unlike studies of isolated daily decisions, the proposed framework introduces the notion of decision strategies to manage the construction process. A decision strategy is a sequence of interdependent decisions determined by resource allocation policies such as labor, material, equipment, and space policies. The schedule-based simulation framework consists of two parts, experiment design and result assessment. The core of the experiment design is the establishment of an iterative method to test and improve decision strategies and schedules, based on the introduction of decision strategies and the development of a schedule-based simulation testbed. The simulation testbed used is the Interactive Construction Decision Making Aid (ICDMA). ICDMA has an emulator, developed in previous work, that duplicates the construction process, and a random event generator that allows the decision-maker to respond to disruptions in the emulation. It is used to study how the schedule responds to these disruptions and to the corresponding decisions made over the duration of the project, while accounting for cascading impacts and dependencies between activities. The dissertation is organized into two parts. The first part presents the existing research, identifies the departure points of this work, and develops a schedule-based simulation framework to design, assess, and improve decision strategies. In the second part, the proposed schedule-based simulation framework is applied to investigate specific research problems.
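To make the notion of a decision strategy concrete, the sketch below models a strategy as an ordered sequence of resource-allocation decisions applied when random disruptions hit scheduled activities, and compares strategies by simulating many project runs. The activity data, policy names and adjustment rules are illustrative assumptions, not the dissertation's actual ICDMA model.

```python
import random
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    duration: int   # planned duration in days
    crew_size: int  # planned crew

def apply_policy(duration, crew_size, policy):
    """Return the adjusted duration after applying a resource-allocation policy (assumed rules)."""
    if policy == "add_labor":
        # assume adding one crew member recovers a proportional share of the delay
        return max(1, round(duration * crew_size / (crew_size + 1)))
    if policy == "work_overtime":
        return max(1, round(duration * 0.8))
    return duration  # "do_nothing"

def simulate(schedule, strategy, disruption_prob=0.3, delay=2, seed=0):
    """One project run: random disruptions delay activities, the strategy reacts locally."""
    rng = random.Random(seed)
    total = 0
    for activity, policy in zip(schedule, strategy):
        duration = activity.duration
        if rng.random() < disruption_prob:  # random disruptive event hits this activity
            duration = apply_policy(duration + delay, activity.crew_size, policy)
        total += duration                   # delays cascade into the project finish date
    return total

schedule = [Activity("excavation", 10, 4), Activity("foundation", 15, 6), Activity("framing", 20, 8)]
strategies = {
    "passive":   ["do_nothing", "do_nothing", "do_nothing"],
    "add_labor": ["add_labor", "add_labor", "add_labor"],
}
for name, strategy in strategies.items():
    runs = [simulate(schedule, strategy, seed=s) for s in range(200)]
    print(name, "mean project duration:", sum(runs) / len(runs))
```

Repeating such runs for alternative schedules and strategies is the kind of iterative assess-and-improve loop the framework describes, here reduced to its simplest possible form.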
Abstract:
The death ligand members of the tumor necrosis factor (TNF) family are potent inducers of apoptosis in a variety of cell types. In particular, TNF-related apoptosis-inducing ligand (TRAIL) has recently received much scientific and commercial attention because of its potent tumor cell-killing activity while leaving normal untransformed cells mostly unaffected. Furthermore, TRAIL strongly synergizes with conventional chemotherapeutic drugs in inducing tumor cell apoptosis, making it a most promising candidate for future cancer therapy. Increasing evidence indicates, however, that TRAIL may also induce or modulate apoptosis in primary cells. A particular concern is the potential side effect of TRAIL-based tumor therapies in the liver. In this review we summarize some of the recent findings on the role of TRAIL in tumor cell and hepatocyte apoptosis.
Abstract:
This study investigated the effects of patient variables (physical and cognitive disability, significant others' preference and social support) on nurses' nursing home placement decision-making and explored nurses' participation in the decision-making process. The study was conducted in a hospital in Texas. A sample of registered nurses on units that refer patients for nursing home placement was asked to review a series of vignettes describing elderly patients that differed in terms of the study variables and to indicate the extent to which they agreed with nursing home placement on a five-point Likert scale. The vignettes were judged to have good content validity by a group of five colleagues (expert consultants), and test-retest reliability based on the Pearson correlation coefficient was satisfactory (average of .75) across all vignettes. The study tested the following hypotheses: nurses have a greater propensity to recommend placement when (1) patients have severe physical disabilities; (2) patients have severe cognitive disabilities; (3) it is the significant others' preference; and (4) patients have neither social support nor alternative services. Further hypotheses were that (5) a nurse's characteristics and extent of participation will not have a significant effect on the placement decision; and (6) a patient's social support is the most important single factor, and the combination of severe physical and cognitive disability, significant others' preference, and no social support or alternative services will be the most important set of predictors of a nurse's placement decision. Analysis of variance (ANOVA) was used to analyze the relationships implied in the hypotheses. A series of one-way ANOVAs (bivariate analyses) of the main effects supported hypotheses one through five. Overall, the n-way ANOVA (multivariate analysis) of the main effects confirmed that social support was the most important single factor when controlling for other variables. The four-way interaction model confirmed that the most predictive combination of patient characteristics was severe physical and cognitive disability, no social support, and significant others who did not desire placement. These analyses provided an understanding of the influence of specific patient variables on nurses' recommendations regarding placement.
Abstract:
According to Attentional Control Theory (ACT; Eysenck et al., 2007), processing efficiency decreases in high-anxiety situations because worrying thoughts compete for attentional resources. A repeated-measures design (high/low state anxiety and high/low perceptual task demands) was used to test ACT's explanations. Complex football situations were displayed to expert and non-expert football players in a decision-making task in a controlled laboratory setting. Ratings of state anxiety and pupil diameter measures were used to check the anxiety manipulation. Dependent variables were verbal response time and accuracy, mental effort ratings and visual search behavior (e.g., visual search rate). Results confirmed that an increase in anxiety, indicated by higher state-anxiety ratings and larger pupil diameters, reduced processing efficiency for both groups (higher response times and mental effort ratings). Moreover, high task demands reduced the ability to shift attention between different locations for the expert group in the high-anxiety condition only. Since it was particularly the experts, who were expected to use more top-down strategies to guide visual attention under high perceptual task demands, who showed fewer attentional shifts in the high- than in the low-anxiety condition, anxiety appears, as predicted by ACT, to impair the shifting function by disrupting the balance between top-down and bottom-up processes.
Abstract:
Repair of retinal detachment is a common ophthalmologic procedure, and its outcome is typically measured by a single factor: improvement in visual acuity. Health-related functional outcome testing, which quantifies a patient's self-reported perception of impairment, can be integrated with objective clinical findings. Based on the patient's self-assessed lifestyle impairment, the physician and patient together can make an informed decision on the treatment that is most likely to benefit the patient. A functional outcome test (the Houston Vision Assessment Test-Retina; HVAT-Retina) was developed and validated in patients with multiple retinal detachments in the same eye. The HVAT-Retina divides an estimated total impairment into subcomponents: the contribution of visual disability (potentially correctable by retinal detachment surgery) and of nonvisual physical disabilities (co-morbidities not affected by retinal detachment surgery). Seventy-six patients participated in this prospective multicenter study. Seven patients were excluded from the analysis because they were not certain of their answers. Cronbach's alpha coefficient was 0.91 for the pre-surgery HVAT-Retina and 0.94 post-surgery. The item-to-total correlation ranged from 0.50 to 0.88. The visual impairment score improved by 9 points from pre-surgery (p = 0.0003). The physical impairment score also improved from pre-surgery (p = 0.0002). In conclusion, the results of this study demonstrate that the instrument is reliable and valid in patients presenting with recurrent retinal detachments. The HVAT-Retina is a simple instrument and does not burden the patient or the health professional in terms of time or cost. It may be self-administered, not requiring an interviewer. Because the HVAT-Retina was designed to demonstrate outcomes perceivable by the patient, it has the potential to guide the decision-making process between patient and physician.
Abstract:
This study developed proxy measures to test the independent effects of medical specialty, institutional ethics committee (IEC) and the interaction between the two upon a proxy for the dependent variable of the medical decision to withhold/withdraw care for the dying: the resuscitation index (R-index). Five clinical vignettes were constructed and validated to convey the realism and contextual factors implicit in the decision to withhold/withdraw care. A scale was developed to determine the range of contact with an IEC in terms of physician knowledge and use of IEC policy. The study sample comprised 215 physicians in a teaching hospital in the Southwest, for whom proxy measures were tested for two competing influences, medical specialty and IEC, which alternately oppose and support the decision to withhold/withdraw care for the dying. A sub-sample of surgeons supported the hypothesis that an IEC is influential in opposing the medical training imperative to prolong life. Surgeons with a low IEC score were 326 percent more likely to continue care than were surgeons with a high IEC score, when compared to all other specialties. IEC alone was also found to significantly predict the decision to withhold/withdraw care. The interaction of IEC with the specialty of surgery was found to be the best predictor of a decision to withhold/withdraw care for the dying.
Abstract:
This work studies image coding under the HEVC (High Efficiency Video Coding) standard. The project focuses on the hybrid encoder and, more specifically, on the inverse cosine transform, which is applied both in the encoder and in the decoder. The need to encode video arises from the appearance of image sequences as digital signals. The main problem with video is the number of bits produced when it is encoded: as image quality increases, the amount of information to be encoded grows exponentially. The use of transforms in digital image processing has grown over the years, and the inverse cosine transform has become the most widely used method in image and video coding; its properties make it possible to achieve high compression ratios at very low cost. Transform theory has improved image processing: in transform coding, an image is divided into blocks and each block is mapped to a set of coefficients. This coding exploits the statistical dependencies within images to reduce the amount of data. The project reviews the evolution of the different video coding standards over the years and analyses the hybrid encoder and the HEVC standard in greater depth.
The final objective of this final-year project is the implementation of the core of a specific processor that executes the inverse cosine transform in a video decoder compatible with the HEVC standard. This objective is reached through a series of stages in which requirements are progressively added, allowing the hardware designer to gain experience and a deeper understanding of the final architecture.
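The block-based transform coding described above can be illustrated with a minimal floating-point sketch of the forward and inverse cosine transform on an N x N block. HEVC itself specifies integer-arithmetic transforms and block sizes from 4x4 to 32x32; the orthonormal DCT-II used here, the 8x8 block size and the crude coefficient truncation are simplifying assumptions for illustration only.

```python
import numpy as np

N = 8  # assumed transform block size

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix T, with T[k, i] the k-th basis function at sample i."""
    scale = np.array([np.sqrt(1.0 / n) if k == 0 else np.sqrt(2.0 / n) for k in range(n)])
    k = np.arange(n).reshape(-1, 1)
    i = np.arange(n).reshape(1, -1)
    return scale.reshape(-1, 1) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))

T = dct_matrix(N)

def forward_transform(block):
    """Encoder side: pixel block -> transform coefficients."""
    return T @ block @ T.T

def inverse_transform(coeffs):
    """Decoder side: transform coefficients -> reconstructed pixel block."""
    return T.T @ coeffs @ T

# Transform one block, keep only the low-frequency coefficients (a crude stand-in
# for quantization that shows the energy compaction enabling high compression
# ratios), then reconstruct it with the inverse transform.
rng = np.random.default_rng(0)
block = rng.integers(0, 256, size=(N, N)).astype(float)
coeffs = forward_transform(block)
coeffs[N // 2:, :] = 0.0  # discard high vertical frequencies
coeffs[:, N // 2:] = 0.0  # discard high horizontal frequencies
reconstructed = inverse_transform(coeffs)
print("mean absolute reconstruction error:", np.abs(block - reconstructed).mean())
```

Zeroing the high-frequency coefficients mimics the compression step only in spirit; a real HEVC codec quantizes and entropy-codes the coefficients rather than discarding them outright.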
Abstract:
This paper proposes an adaptive algorithm for clustering cumulative probability distribution functions (c.p.d.f.) of a continuous random variable, observed in different populations, into the minimum number of homogeneous clusters, making no parametric assumptions about the c.p.d.f.'s. The proposed distance function for clustering c.p.d.f.'s is based on the Kolmogorov–Smirnov two-sample statistic. This test is able to detect differences in position, dispersion or shape of the c.p.d.f.'s. In our context, this statistic allows us to cluster the recorded data with a homogeneity criterion based on the whole distribution of each data set, and to decide whether or not it is necessary to add more clusters. In this sense, the proposed algorithm is adaptive, as it automatically increases the number of clusters only as necessary; therefore, there is no need to fix the number of clusters in advance. The outputs of the algorithm are, for each cluster, the common c.p.d.f. of all observed data in the cluster (the centroid) and the Kolmogorov–Smirnov statistic between the centroid and the most distant c.p.d.f. The proposed algorithm has been applied to a large data set of solar global irradiation spectra distributions. The results obtained make it possible to reduce the information in more than 270,000 c.p.d.f.'s to only 6 clusters, each corresponding to a different c.p.d.f.
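A minimal sketch of this adaptive rule, assuming empirical c.d.f.'s evaluated on a common grid, a fixed distance threshold, and a pointwise-mean centroid update, is shown below. These are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np

def ks_distance(cdf_a, cdf_b):
    """Kolmogorov-Smirnov distance between two c.d.f.'s evaluated on a common grid."""
    return np.max(np.abs(cdf_a - cdf_b))

def adaptive_cluster(cdfs, threshold=0.1):
    """Cluster c.d.f. curves, opening a new cluster only when no centroid is close enough."""
    centroids = []  # representative c.d.f. of each cluster
    members = []    # member curves of each cluster
    for cdf in cdfs:
        dists = [ks_distance(cdf, c) for c in centroids]
        if not dists or min(dists) > threshold:
            centroids.append(cdf.copy())  # this curve seeds a new cluster
            members.append([cdf])
        else:
            k = int(np.argmin(dists))
            members[k].append(cdf)
            # update the centroid as the pointwise mean of its members (assumed rule)
            centroids[k] = np.mean(members[k], axis=0)
    # per-cluster KS statistic between the centroid and its most distant member
    radii = [max(ks_distance(c, m) for m in ms) for c, ms in zip(centroids, members)]
    return centroids, radii

# Toy data: empirical c.d.f.'s of samples drawn from two different distributions.
rng = np.random.default_rng(0)
grid = np.linspace(-4, 8, 200)
samples = [rng.normal(0, 1, 500) for _ in range(20)] + [rng.normal(3, 1, 500) for _ in range(20)]
cdfs = [np.searchsorted(np.sort(s), grid, side="right") / len(s) for s in samples]
centroids, radii = adaptive_cluster(cdfs, threshold=0.15)
print("clusters found:", len(centroids))
print("max within-cluster KS distance:", [round(r, 3) for r in radii])
```

Because clusters are added only when a curve exceeds the threshold distance from every existing centroid, the number of clusters does not need to be fixed in advance, which is the adaptive behaviour the abstract describes.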
Abstract:
On 22 January 2014, the European Commission is expected to publish its proposals for the 2030 Framework for Climate and Energy Policies, which will be discussed and possibly, or perhaps only partly, agreed at the European Council of 20-21 March 2014. This is the first comprehensive review of the 2007-09 Climate and Energy Package, which resulted in the so-called ‘20-20-20’ targets for 2020. The principal intention is to define the EU’s climate change and energy policy framework for the next decade and beyond, so as to give investors an adequate degree of predictability, if not certainty. This Commentary argues, however, that the ‘2030 Framework’ is not just about predictability; it is also about making the proper adjustments based on the lessons learned and in response to new issues that have emerged in the interim. The authors ask what the main lessons are and how they should influence the 2030 Framework. Put differently, what conditions will the “2030 Framework” need to meet in order to offer a viable package for discussion?
Abstract:
Recently, few aspects of the energy debate have been as divisive as capacity markets. After having given a green light to a capacity remuneration scheme in the UK in 2014, the EU Commission is now considering starting a sector inquiry in several member states. This paper aims to shed some light on what capacity markets are about and what the EU-specific implications are, arguing that the debate is ill-framed within a market context still focused on conventional power generation, and making the case for a coordinated approach to address the flaws of the present system.
Abstract:
Electronic text and image data.
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.