242 results for Arrow
Abstract:
We report a new STAR measurement of the longitudinal double-spin asymmetry A_LL for inclusive jet production at midrapidity in polarized p+p collisions at a center-of-mass energy of √s = 200 GeV. The data, which cover jet transverse momenta 5 < p_T < 30 GeV/c, are substantially more precise than previous measurements. They provide significant new constraints on the gluon spin contribution to the nucleon spin through the comparison to predictions derived from one global fit to polarized deep-inelastic scattering measurements.
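The double-spin asymmetry described above is conventionally estimated from spin-sorted yields normalized by the product of the beam polarizations. A minimal sketch of that standard estimator (the numbers below are illustrative, not STAR data):

```python
def a_ll(n_same, n_opp, pol_1, pol_2):
    """Longitudinal double-spin asymmetry estimator:
    A_LL = (1 / P1 P2) * (N_same - N_opp) / (N_same + N_opp),
    where N_same / N_opp are yields for same- and opposite-helicity
    beam configurations and P1, P2 are the beam polarizations."""
    return (n_same - n_opp) / (n_same + n_opp) / (pol_1 * pol_2)

# illustrative yields and 60% beam polarizations (not real data):
print(round(a_ll(10200, 10000, 0.6, 0.6), 4))  # small positive asymmetry
```

Dividing by the polarization product corrects the raw counting asymmetry back to the parton-level quantity; in practice relative luminosities of the spin states are also folded in.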
Abstract:
Over the last decades, anti-resonant reflecting optical waveguides (ARROW) have been used in different integrated optics applications. In this type of waveguide, light confinement is partially achieved through an anti-resonant reflection. In this work, the simulation, fabrication, and characterization of ARROW waveguides using dielectric films deposited by a plasma-enhanced chemical vapor deposition (PECVD) technique at low temperatures (~300 °C) are presented. Silicon oxynitride (SiOxNy) films were used as the core and second cladding layers, and amorphous hydrogenated silicon carbide (a-SiC:H) films as the first cladding layer. Furthermore, numerical simulations were performed using homemade routines based on two computational methods: the transfer matrix method (TMM) for the determination of the optimum thickness of the Fabry-Perot layers, and the non-uniform finite difference method (NU-FDM) for 2D design and determination of the maximum width that yields single-mode operation. The use of a silicon carbide anti-resonant layer resulted in low optical attenuation, owing to the high refractive-index difference between the core and this layer. Finally, for comparison purposes, optical waveguides using titanium oxide (TiO2) as the first ARROW layer were also fabricated and characterized.
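The transfer matrix method mentioned above is a standard way to evaluate the reflectance of a layered stack. A minimal normal-incidence sketch (not the authors' routines; layer indices and thicknesses would be placeholders for the SiOxNy / a-SiC:H values):

```python
import numpy as np

def reflectance(layers, n_in, n_sub, wavelength):
    """Normal-incidence reflectance of a thin-film stack via the
    characteristic (transfer) matrix method. `layers` is a list of
    (refractive_index, thickness) tuples, listed from the input side."""
    m = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2 * np.pi * n * d / wavelength  # phase thickness
        m = m @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    # input admittance seen from the incidence medium
    y = (m[1, 0] + n_sub * m[1, 1]) / (m[0, 0] + n_sub * m[0, 1])
    r = (n_in - y) / (n_in + y)
    return abs(r) ** 2

# sanity check with no layers: air over glass, R = ((1-1.5)/(1+1.5))^2 = 0.04
print(round(reflectance([], 1.0, 1.5, 632.8e-9), 4))
```

For an ARROW design, one would sweep the first-cladding thickness and pick the anti-resonant value that maximizes the reflectance seen by the core mode.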
Abstract:
The graded-fermion algebra and quasispin formalism are introduced and applied to obtain the gl(m|n) ↓ osp(m|n) branching rules for the two-column tensor irreducible representations of gl(m|n), for the case m ≤ n (n > 2). In the case m < n, all such irreducible representations of gl(m|n) are shown to be completely reducible as representations of osp(m|n). This is also shown to be true for the case m = n, except for the spin-singlet representations, which contain an indecomposable representation of osp(m|n) with composition length 3. These branching rules are given in fully explicit form. (C) 1999 American Institute of Physics. [S0022-2488(99)04410-2]
Abstract:
Background: A common task in analyzing microarray data is to determine which genes are differentially expressed across two (or more) kinds of tissue samples, or across samples obtained under different experimental conditions. Several statistical methods have been proposed to accomplish this goal, generally based on measures of distance between classes. It is well known that biological samples are heterogeneous because of factors such as molecular subtypes or genetic background that are often unknown to the experimenter. For instance, in experiments involving molecular classification of tumors it is important to identify significant subtypes of cancer. Bimodal or multimodal distributions often reflect the presence of mixtures of subsamples. Consequently, there can be genes differentially expressed in sample subgroups which are missed if the usual statistical approaches are used. In this paper we propose a new graphical tool which identifies not only genes with up- and down-regulation, but also genes with differential expression in different subclasses, which are usually missed if current statistical methods are used. This tool is based on two measures of distance between samples, namely the overlapping coefficient (OVL) between two densities and the area under the receiver operating characteristic (ROC) curve. The methodology proposed here was implemented in the open-source R software.
Results: The method was applied to a publicly available dataset, as well as to a simulated dataset. We compared our results with those obtained using some of the standard methods for detecting differentially expressed genes, namely the Welch t-statistic, fold change (FC), rank products (RP), average difference (AD), weighted average difference (WAD), moderated t-statistic (modT), intensity-based moderated t-statistic (ibmT), significance analysis of microarrays (samT), and area under the ROC curve (AUC). On both datasets, the differentially expressed genes with bimodal or multimodal distributions were missed by all of the standard selection procedures. We also compared our results with (i) the area between the ROC curve and the rising diagonal (ABCR) and (ii) the test for not-proper ROC curves (TNRC). We found our methodology more comprehensive, because it detects both bimodal and multimodal distributions, and different variances can be considered in both samples. Another advantage of our method is that the behavior of different kinds of differentially expressed genes can be analyzed graphically.
Conclusion: Our results indicate that the arrow plot represents a new flexible and useful tool for the analysis of gene expression profiles from microarrays.
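The two per-gene coordinates of the arrow plot, OVL and AUC, are both computable from the two groups of expression values. A minimal NumPy sketch (the original tool is in R; the Gaussian fit for OVL is a simplifying assumption here):

```python
import numpy as np

def auc(x, y):
    """Area under the ROC curve via the Mann-Whitney statistic:
    P(X < Y) + 0.5 * P(X == Y) over all cross-group pairs."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    less = (x[:, None] < y[None, :]).sum()
    ties = (x[:, None] == y[None, :]).sum()
    return (less + 0.5 * ties) / (len(x) * len(y))

def ovl(x, y, grid_size=2048):
    """Overlapping coefficient of two Gaussian densities fitted to the
    samples: the integral of min(f1, f2), here by the trapezoid rule."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    lo = min(x.mean() - 5 * x.std(), y.mean() - 5 * y.std())
    hi = max(x.mean() + 5 * x.std(), y.mean() + 5 * y.std())
    t = np.linspace(lo, hi, grid_size)
    norm = lambda s: (np.exp(-0.5 * ((t - s.mean()) / s.std()) ** 2)
                      / (s.std() * np.sqrt(2 * np.pi)))
    fmin = np.minimum(norm(x), norm(y))
    return float(np.sum(0.5 * (fmin[1:] + fmin[:-1]) * np.diff(t)))

# identical groups: uninformative classifier (AUC = 0.5, OVL ~ 1)
z = np.array([1.0, 2.0, 3.0, 4.0])
print(round(auc(z, z), 2), round(ovl(z, z), 2))
```

In the arrow plot, genes near (OVL ≈ 0, AUC ≈ 0 or 1) are clearly up- or down-regulated, while low OVL with intermediate AUC flags the bimodal/multimodal cases the standard tests miss.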
Abstract:
Microarrays allow thousands of genes to be monitored simultaneously, quantifying the abundance of their transcripts under the same experimental condition at the same time. Among the various available array technologies, double-channel cDNA microarray experiments have arisen in numerous technical protocols associated with genomic studies, and they are the focus of this work. Microarray experiments involve many steps, and each one can affect the quality of the raw data. Background correction and normalization are preprocessing techniques used to clean and correct the raw data when undesirable fluctuations arise from technical factors. Several recent studies showed that there is no preprocessing strategy that outperforms the others in all circumstances, and thus it seems difficult to provide general recommendations. In this work, we propose using exploratory techniques to visualize the effects of preprocessing methods on the statistical analysis of cancer two-channel microarray data sets in which the cancer types (classes) are known. The arrow plot was used to select differentially expressed genes, and the graph of profiles resulting from correspondence analysis was used to visualize the results. Six background-correction methods and six normalization methods were used, yielding 36 preprocessing combinations, and the analysis was carried out on a published cDNA microarray database (Liver), available at http://genome-www5.stanford.edu/, whose microarrays were already classified by cancer type. All statistical analyses were performed using the R statistical software.
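One of the 36 background/normalization combinations could look like the following minimal sketch: plain background subtraction followed by global median centering of the two-channel log-ratios (a hypothetical helper for illustration, not the paper's R pipeline):

```python
import numpy as np

def preprocess(red_fg, red_bg, green_fg, green_bg):
    """One illustrative preprocessing combination for two-channel data:
    background subtraction, then global median normalization of the
    log-ratios M = log2(R/G); also returns intensities A."""
    r = np.maximum(np.asarray(red_fg, float) - np.asarray(red_bg, float), 0.5)
    g = np.maximum(np.asarray(green_fg, float) - np.asarray(green_bg, float), 0.5)
    # the 0.5 floor avoids taking the log of non-positive intensities
    m = np.log2(r / g)               # per-spot log-ratio
    a = 0.5 * np.log2(r * g)         # per-spot average log-intensity
    return m - np.median(m), a       # median-centered M, unchanged A

m, a = preprocess([1000, 400], [100, 100], [500, 200], [100, 100])
print(np.round(m, 3))  # centered log-ratios, median exactly zero
```

Swapping in other background corrections (e.g. none, or model-based) and other normalizations (e.g. intensity-dependent loess) generates the grid of combinations whose downstream effects the abstract proposes to compare graphically.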
Abstract:
In the literature on risk, one generally assumes that uncertainty is uniformly distributed over the entire working horizon when the absolute risk-aversion index is negative and constant. From this perspective, the risk is totally exogenous, and thus independent of endogenous risks. The classic procedure is "myopic" with regard to potential changes in the future behavior of the agent due to inherent random fluctuations of the system. The agent's attitude to risk is rigid. Although often criticized, the most widely used hypothesis for the analysis of economic behavior is risk-neutrality. This borderline case must be treated with prudence in a dynamic stochastic context. The traditional measures of risk aversion are generally too weak for making comparisons between risky situations, given the dynamic complexity of the environment. This can be highlighted in concrete problems in finance and insurance, contexts in which the Arrow-Pratt measures (in the small) give ambiguous results.
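The Arrow-Pratt measure referred to above is the local index A(x) = -u''(x)/u'(x). A small numerical sketch, using the textbook CARA utility (the finite-difference helper is illustrative):

```python
import numpy as np

def arrow_pratt(u, x, h=1e-4):
    """Arrow-Pratt coefficient of absolute risk aversion,
    A(x) = -u''(x) / u'(x), estimated by central finite differences."""
    du = (u(x + h) - u(x - h)) / (2 * h)
    d2u = (u(x + h) - 2 * u(x) + u(x - h)) / h ** 2
    return -d2u / du

# CARA utility u(x) = -exp(-a*x): A(x) is constant and equal to a,
# the "constant absolute risk aversion" case mentioned in the abstract
a = 0.5
u = lambda x: -np.exp(-a * x)
print(round(arrow_pratt(u, 1.0), 3), round(arrow_pratt(u, 10.0), 3))
```

Because A(x) is a local (second-order) index, two lotteries can be ranked differently by agents with the same A at a point, which is the "in the small" ambiguity the abstract alludes to in dynamic settings.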
Abstract:
We apply majorization theory to study the quantum algorithms known so far and find that there is a majorization principle underlying the way they operate. Grover's algorithm is a neat instance of this principle, where majorization works step by step until the optimal target state is found. Extensions of this situation are also found in algorithms based on quantum adiabatic evolution and in the family of quantum phase-estimation algorithms, including Shor's algorithm. We state that in quantum algorithms the time arrow is a majorization arrow.
Abstract:
In spatial environments, we consider social welfare functions satisfying Arrow's requirements, i.e., weak Pareto and independence of irrelevant alternatives. When the policy space is a one-dimensional continuum, such a welfare function is determined by a collection of 2n strictly quasi-concave preferences and a tie-breaking rule. As a corollary, we obtain that when the number of voters is odd, simple majority voting is transitive if and only if each voter's preference is strictly quasi-concave. When the policy space is multi-dimensional, we establish Arrow's impossibility theorem. Among other results, we show that weak Pareto, independence of irrelevant alternatives, and non-dictatorship are inconsistent if the set of alternatives has a non-empty interior and is compact and convex.
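The one-dimensional corollary can be illustrated numerically: with strictly quasi-concave (single-peaked) preferences and an odd number of voters, pairwise majority voting is transitive and the median ideal point is a Condorcet winner. A hedged sketch with absolute-value utilities and made-up ideal points:

```python
import numpy as np

def majority_prefers(peaks, a, b):
    """True if a strict majority of voters with single-peaked utilities
    u_i(x) = -|x - peak_i| strictly prefer alternative a to b."""
    peaks = np.asarray(peaks, float)
    return np.sum(np.abs(peaks - a) < np.abs(peaks - b)) > len(peaks) / 2

# with an odd number of voters the median ideal point beats every other
# alternative on the line, so the majority relation has no cycles
peaks = [0.1, 0.4, 0.9]            # hypothetical voter ideal points
med = float(np.median(peaks))
others = [b for b in np.linspace(0, 1, 101) if abs(b - med) > 1e-9]
print(all(majority_prefers(peaks, med, b) for b in others))
```

In two or more dimensions no such median exists in general, which is the intuition behind the multi-dimensional impossibility result stated in the abstract.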