Abstract:
Establishing CD8(+) T cell cultures has been empirical, and published methods have been largely laboratory specific. In this study, we optimized culturing conditions and show that IL-2 concentration is the most critical factor for the success of establishing CD8(+) T cell cultures. High IL-2 concentrations encouraged T cells to proliferate non-specifically, express the B cell marker B220, and undergo apoptosis. These cells also lost the typical irregular T cell morphology and were incapable of sustaining long-term cultures. Using tetramer and intracellular cytokine assessments, we further demonstrated that many antigen-specific T cells were rendered nonfunctional when expanded under high IL-2 concentrations. When IL-2 was used in the correct range, B220-mediated cell depletion greatly enhanced the success rate of such T cell cultures.
Abstract:
From a theoretical perspective, an extension to the Full Range Leadership Theory (FRLT) seems needed. In this paper, we explain why instrumental leadership, a class of leadership that includes leader behaviors focusing on task and strategic aspects that are neither values based nor exchange oriented, can fulfill this extension. Instrumental leadership is composed of four factors: environmental monitoring, strategy formulation and implementation, path-goal facilitation, and outcome monitoring; these aspects of leadership are currently not included in any of the FRLT's nine leadership scales (as measured by the MLQ, the Multifactor Leadership Questionnaire). We present results from two empirical studies using very large samples from a wide array of countries (N > 3,000) to examine the factorial, discriminant, and criterion-related validity of the instrumental leadership scales. We find support for a four-factor instrumental leadership model, which explains incremental variance in leader outcomes over and above transactional and transformational leadership.
Abstract:
In the 1920s, Ronald Fisher developed the theory behind the p value, and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers with important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators select a threshold p value below which they reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that are often misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true, rather than the probability of obtaining the difference observed, or one more extreme, given that the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
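The p value definition in this abstract can be made concrete with a short script (a minimal sketch; the function name and the z = 1.96 example are ours, not from the article):

```python
import math

def two_sided_p_value(z: float) -> float:
    """Two-sided p value for a standard-normal test statistic z:
    the probability, under the null hypothesis, of obtaining a
    statistic at least as extreme as the one observed."""
    # Standard-normal tail probability expressed via the error function.
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

# The conventional 5% threshold corresponds to |z| = 1.96.
print(round(two_sided_p_value(1.96), 3))  # → 0.05
```

Note that this number says nothing about the probability that the null hypothesis itself is true, which is exactly the misconception the abstract warns against.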
Abstract:
This paper evaluates the reception of Léon Walras' ideas in Russia before 1920. Despite an unfavourable institutional context, Walras was read by Russian economists. On the one hand, Bortkiewicz and Winiarski, who lived outside Russia and had the opportunity to meet and correspond with Walras, were first-class readers and very good ambassadors for Walras' ideas; on the other, the economists living in Russia were more selective in their readings. They restricted themselves to Walras' Elements of Pure Economics, in particular its theory of exchange, while ignoring its theory of production. We introduce a cultural argument to explain their selective reading. JEL classification numbers: B13, B19.
Abstract:
Knowledge of the relationship that links radiation dose and image quality is a prerequisite to any optimization of medical diagnostic radiology. Image quality depends, on the one hand, on physical parameters such as contrast, resolution, and noise, and on the other hand, on characteristics of the observer who assesses the image. While the role of contrast and resolution is precisely defined and recognized, the influence of image noise is not yet fully understood. Its measurement is often based on imaging uniform test objects, even though real images contain anatomical backgrounds whose statistical nature differs greatly from the test objects used to assess system noise. The goal of this study was to demonstrate the importance of variations in background anatomy by quantifying their effect on a series of detection tasks. Several types of mammographic backgrounds and signals were examined in psychophysical experiments using a two-alternative forced-choice detection task. According to hypotheses concerning the strategy used by the human observers, their signal-to-noise ratio was determined. This variable was also computed for a mathematical model based on statistical decision theory. By comparing the theoretical model with the experimental results, the way that anatomical structure is perceived was analyzed. Experiments showed that the observer's behavior was highly dependent upon both system noise and the anatomical background. The anatomy acts partly as a signal recognizable as such and partly as a pure noise that disturbs the detection process. This dual nature of the anatomy is quantified. It is shown that its effect varies according to its amplitude and the profile of the object being detected. The importance of the noisy part of the anatomy is, in some situations, much greater than the system noise. Hence, reducing the system noise by increasing the dose will not improve task performance. This observation indicates that the tradeoff between dose and image quality might be optimized by accepting a higher system noise. This could lead to better resolution, more contrast, or less dose.
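The observer signal-to-noise ratio discussed above relates to detection performance through a standard result of statistical decision theory; as a hedged illustration (the function and the d' values are ours, assuming an ideal observer in the two-alternative forced-choice task):

```python
import math

def two_afc_percent_correct(d_prime: float) -> float:
    """Proportion correct of an ideal observer in a two-alternative
    forced-choice (2AFC) detection task with signal-to-noise ratio d'.
    Standard decision-theoretic result: Pc = Phi(d' / sqrt(2)),
    written here via the error function: Phi(x) = 0.5*(1 + erf(x/sqrt(2)))."""
    return 0.5 * (1.0 + math.erf(d_prime / 2.0))

print(round(two_afc_percent_correct(0.0), 2))  # chance level → 0.5
print(round(two_afc_percent_correct(2.0), 2))  # → 0.92
```

Under this relation, once anatomical background rather than system noise limits d', raising the dose (which lowers system noise only) leaves the percent correct essentially unchanged, which is the tradeoff the study points out.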
Abstract:
The Rusk-Skinner formalism was developed in order to give a geometrically unified formalism for describing mechanical systems. It incorporates all the characteristics of the Lagrangian and Hamiltonian descriptions of these systems (including dynamical equations and solutions, constraints, the Legendre map, evolution operators, equivalence, etc.). In this work we extend this unified framework to first-order classical field theories, and show how this description comprises the main features of the Lagrangian and Hamiltonian formalisms, both for the regular and singular cases. This formulation is a first step toward further applications in optimal control theory for partial differential equations. © 2004 American Institute of Physics.
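As a brief sketch of the mechanical case (our summary of the standard Rusk-Skinner construction, not quoted from the paper): the formalism works on the Whitney sum $W = TQ \times_Q T^*Q$, with coordinates $(q, v, p)$, where a single function couples the Lagrangian and Hamiltonian pictures:

```latex
H(q, v, p) = \langle p, v \rangle - L(q, v), \qquad i_X \Omega = \mathrm{d}H ,
```

with $\Omega$ the pullback to $W$ of the canonical symplectic form on $T^*Q$. The presymplectic equation $i_X \Omega = \mathrm{d}H$ then yields both the Legendre map $p = \partial L / \partial v$ (as a consistency condition) and the dynamical equations, which is the sense in which the two descriptions are unified.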
Abstract:
This paper studies a risk measure inherited from ruin theory and investigates some of its properties. Specifically, we consider a value-at-risk (VaR)-type risk measure defined as the smallest initial capital needed to ensure that the ultimate ruin probability is less than a given level. This VaR-type risk measure turns out to be equivalent to the VaR of the maximal deficit of the ruin process in infinite time. A related Tail-VaR-type risk measure is also discussed.
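As a hedged numerical illustration of this VaR-type measure (our sketch, assuming the classical Cramér-Lundberg model with exponential claims, which admits a closed-form ruin probability; none of the numbers come from the paper):

```python
import math

def ruin_probability(u: float, theta: float, mu: float) -> float:
    """Ultimate ruin probability in the classical Cramér-Lundberg model
    with exponentially distributed claims (mean mu) and relative safety
    loading theta, given initial capital u."""
    return (1.0 / (1.0 + theta)) * math.exp(-theta * u / (mu * (1.0 + theta)))

def var_type_capital(alpha: float, theta: float, mu: float) -> float:
    """Smallest initial capital u such that the ultimate ruin probability
    is <= alpha: the VaR-type risk measure described in the abstract,
    here in closed form thanks to the exponential-claims assumption."""
    u = mu * (1.0 + theta) / theta * math.log(1.0 / ((1.0 + theta) * alpha))
    return max(0.0, u)

u = var_type_capital(alpha=0.01, theta=0.2, mu=1.0)
print(round(u, 2))                                 # required initial capital
print(round(ruin_probability(u, 0.2, 1.0), 4))     # → 0.01, the target level
```

For general claim distributions no closed form exists, and the same capital requirement would be found by inverting a numerically or Monte Carlo estimated ruin probability.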
Abstract:
Glutamine has multiple roles in brain metabolism, and its concentration can be altered in various pathological conditions. Accurate knowledge of its concentration is therefore highly desirable for monitoring and studying several brain disorders in vivo. However, in recent years, several MRS studies have reported conflicting glutamine concentrations in the human brain. A recent hypothesis for explaining these discrepancies is that a short T2 component of the glutamine signal may affect its quantification at long echo times. The present study therefore aimed to investigate the impact of acquisition parameters on the quantified glutamine concentration using two different acquisition techniques, SPECIAL at ultra-short echo time and MEGA-SPECIAL at moderate echo time. For this purpose, MEGA-SPECIAL was optimized for the first time for glutamine detection. Based on the very good agreement between the glutamine concentrations obtained with the two measurements, it was concluded that no effect of a short T2 component of the glutamine signal was detectable.
Abstract:
A new aggregation method for decision making is presented, using induced aggregation operators and the index of maximum and minimum level. Its main advantage is that it can assess complex reordering processes in the aggregation, representing complex attitudinal characters of the decision maker such as psychological or personal factors. A wide range of properties and particular cases of this new approach are studied. A further generalization using hybrid averages and immediate weights is also presented. The key advantage of this approach over the previous model is that the weighted average and the ordered weighted average can be used in the same formulation. Thus, we are able to consider the subjective attitude and the degree of optimism of the decision maker in the decision process. The paper ends with an application to a decision making problem based on the use of assignment theory.
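The induced reordering at the heart of such operators can be sketched in a few lines (a minimal illustration of a plain induced ordered weighted average with hypothetical weights and inducing values; not the paper's full operator with the index of maximum and minimum level):

```python
def iowa(pairs, weights):
    """Induced ordered weighted average (IOWA): each argument comes with
    an order-inducing value; arguments are reordered by decreasing
    inducing value and then combined with the OWA weights."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    # Reorder by the inducing variable (first component), descending.
    ordered = [arg for _, arg in sorted(pairs, key=lambda p: p[0], reverse=True)]
    return sum(w * a for w, a in zip(weights, ordered))

# Three criteria scores, each tagged with a hypothetical inducing value
# that could encode, e.g., the decision maker's attitudinal character.
pairs = [(0.3, 70), (0.9, 40), (0.6, 90)]
weights = [0.5, 0.3, 0.2]
print(iowa(pairs, weights))  # 0.5*40 + 0.3*90 + 0.2*70 = 61.0
```

Because the reordering is driven by the inducing values rather than by the argument magnitudes themselves, the same machinery can reproduce a plain weighted average or a plain OWA as special cases, which is the unification the abstract describes.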