119 results for "Ephemeral Computation"

Relevance: 10.00%

Publisher:

Abstract:

In this paper we examine in detail the implementation, with its associated difficulties, of the Killing conditions and gauge fixing into the variational principle formulation of Bianchi-type cosmologies. We address problems raised in the literature concerning the Lagrangian and the Hamiltonian formulations: We prove their equivalence, make clear the role of the homogeneity preserving diffeomorphisms in the phase space approach, and show that the number of physical degrees of freedom is the same in the Hamiltonian and Lagrangian formulations. Residual gauge transformations play an important role in our approach, and we suggest that Poincaré transformations for special relativistic systems can be understood as residual gauge transformations. In the Appendixes, we give the general computation of the equations of motion and the Lagrangian for any Bianchi-type vacuum metric and for spatially homogeneous Maxwell fields in a nondynamical background (with zero currents). We also illustrate our counting of degrees of freedom in an appendix.


We obtain the next-to-next-to-leading-logarithmic renormalization-group improvement of the spectrum of hydrogenlike atoms with massless fermions by using potential NRQED. These results can also be applied to the computation of the muonic hydrogen spectrum, where we are able to reproduce some known double logarithms at O(mα⁶). We compare with other formalisms dealing with logarithmic resummation available in the literature.


We study the singular effects of vanishingly small surface tension on the dynamics of finger competition in the Saffman-Taylor problem, using the asymptotic techniques described by Tanveer [Philos. Trans. R. Soc. London, Ser. A 343, 155 (1993)] and Siegel and Tanveer [Phys. Rev. Lett. 76, 419 (1996)], as well as direct numerical computation, following the numerical scheme of Hou, Lowengrub, and Shelley [J. Comput. Phys. 114, 312 (1994)]. We demonstrate the dramatic effects of small surface tension on the late time evolution of two-finger configurations with respect to exact (nonsingular) zero-surface-tension solutions. The effect is present even when the relevant zero-surface-tension solution has asymptotic behavior consistent with selection theory. Such singular effects, therefore, cannot be traced back to steady state selection theory, and imply a drastic global change in the structure of phase-space flow. They can be interpreted in the framework of a recently introduced dynamical solvability scenario according to which surface tension unfolds the structurally unstable flow, restoring the hyperbolicity of multifinger fixed points.


A simple model exhibiting a noise-induced ordering transition (NIOT) and a noise-induced disordering transition (NIDT), in which the noise is purely multiplicative, is presented. Both transitions are found in two dimensions as well as in one dimension. We show analytically and numerically that the critical behavior of these two transitions is described by the so-called multiplicative noise (MN) universality class. A computation of the set of critical exponents is presented in both d=1 and d=2.
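
The kind of stochastic dynamics involved can be illustrated with a minimal sketch: Euler-Maruyama integration of a generic Langevin equation whose noise term is purely multiplicative (it vanishes with the field). The drift, parameter values, and zero-dimensional setting are illustrative assumptions, not the spatially extended model studied in the abstract.

```python
import numpy as np

def simulate_mn(a=1.0, b=1.0, sigma=0.5, dt=1e-3, steps=10_000, seed=0):
    """Euler-Maruyama integration of a toy multiplicative-noise Langevin
    equation  dx = (a*x - b*x**3) dt + sigma * x dW.
    Zero-dimensional illustration only; the paper's model is a lattice
    field theory with its own drift and couplings."""
    rng = np.random.default_rng(seed)
    x = 0.1
    traj = np.empty(steps)
    for i in range(steps):
        # noise amplitude is proportional to x: purely multiplicative
        x += (a * x - b * x**3) * dt + sigma * x * rng.normal(0.0, np.sqrt(dt))
        traj[i] = x
    return traj
```

Because the noise amplitude is proportional to x, the state x = 0 is absorbing, which is the structural feature underlying the MN universality class.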


We propose a definition of classical differential cross sections for particles with essentially nonplanar orbits, such as spinning ones, and give a method for their computation. The calculations are carried out explicitly for electromagnetic, gravitational, and short-range scalar interactions up to the linear terms in the slow-motion approximation. The contribution of the spin-spin terms is found to be at most 10⁻⁶ times the post-Newtonian ones for the gravitational interaction.


(2+1)-dimensional anti-de Sitter (AdS) gravity is quantized in the presence of an external scalar field. We find that the coupling between the scalar field and gravity is equivalently described by a perturbed conformal field theory at the boundary of AdS3. This allows us to perform a microscopic computation of the transition rates between black hole states due to absorption and induced emission of the scalar field. Detailed thermodynamic balance then yields Hawking radiation as spontaneous emission, and we find agreement with the semiclassical result, including greybody factors. This result also has application to four- and five-dimensional black holes in supergravity.


Background: Molecular tools may help to uncover closely related and still diverging species from a wide variety of taxa and provide insight into the mechanisms, pace and geography of marine speciation. There is a certain controversy on the phylogeography and speciation modes of species-groups with an Eastern Atlantic-Western Indian Ocean distribution, with previous studies suggesting that older events (Miocene) and/or more recent (Pleistocene) oceanographic processes could have influenced the phylogeny of marine taxa. The spiny lobster genus Palinurus allows for testing among speciation hypotheses, since it has a particular distribution with two groups of three species each in the Northeastern Atlantic (P. elephas, P. mauritanicus and P. charlestoni) and Southeastern Atlantic and Southwestern Indian Oceans (P. gilchristi, P. delagoae and P. barbarae). In the present study, we obtain a more complete understanding of the phylogenetic relationships among these species through a combined dataset with both nuclear and mitochondrial markers, by testing alternative hypotheses on both the mutation rate and tree topology under the recently developed approximate Bayesian computation (ABC) methods. Results: Our analyses support a North-to-South speciation pattern in Palinurus, with all the South African species forming a monophyletic clade nested within the Northern Hemisphere species. Coalescent-based ABC methods allowed us to reject the previously proposed hypothesis of a Middle Miocene speciation event related with the closure of the Tethyan Seaway. Instead, divergence times obtained for Palinurus species using the combined mtDNA-microsatellite dataset and standard mutation rates for mtDNA agree with known glaciation-related processes occurring during the last 2 million years. Conclusion: The Palinurus speciation pattern is a typical example of a series of rapid speciation events occurring within a group, with very short branches separating different species.
Our results support the hypothesis that recent climate change-related oceanographic processes have influenced the phylogeny of marine taxa, with most Palinurus species originating during the last two million years. The present study highlights the value of new coalescent-based statistical methods such as ABC for testing different speciation hypotheses using molecular data.
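
The ABC machinery referred to above can be illustrated with a minimal rejection sampler: draw parameters from the prior, simulate data, and keep only draws whose summary statistic falls close to the observed one. The toy normal-mean problem, prior, and tolerance below are illustrative assumptions, unrelated to the lobster dataset.

```python
import numpy as np

def abc_rejection(observed_stat, simulate, prior_draw, eps, n_draws=10_000, seed=0):
    """Plain rejection ABC: keep prior draws whose simulated summary
    statistic lies within eps of the observed one. Generic sketch of the
    method class used in the study, not the study's own analysis."""
    rng = np.random.default_rng(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_draw(rng)
        if abs(simulate(theta, rng) - observed_stat) < eps:
            accepted.append(theta)
    return np.array(accepted)

# Toy example: recover the mean of a normal from its sample mean,
# with a uniform prior on [-5, 5] (all values are hypothetical).
post = abc_rejection(
    1.0,
    simulate=lambda th, rng: rng.normal(th, 1.0, 50).mean(),
    prior_draw=lambda rng: rng.uniform(-5, 5),
    eps=0.1,
)
```

The accepted draws approximate the posterior; model comparison, as in the speciation-hypothesis tests above, then amounts to comparing acceptance rates of competing simulators.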


The labor share of national income is constant under the assumptions of a Cobb-Douglas production function and perfect competition. In this article we relax these assumptions and investigate whether the non-constant behavior of the labor share can be explained by (i) a non-unitary elasticity of substitution between capital and labor and (ii) imperfect competition in the product market. We focus on Spain and the U.S. and estimate a production function with constant elasticity of substitution and imperfect competition in the product market. The degree of imperfect competition is measured by computing the price markup based on the dual approach. We show that the elasticity of substitution is greater than one in Spain and less than one in the U.S. We also show that the price markup drives the elasticity of substitution away from one, increasing it in Spain and reducing it in the U.S. These results are used to explain the declining path of the labor share, common to both economies, and their contrasting capital paths.
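
The mechanism can be made concrete with a small computation: under a CES production function with a price markup, the labor share moves with capital deepening whenever the elasticity of substitution differs from one. The functional form is the standard CES; the parameter values below are illustrative assumptions, not the paper's estimates.

```python
import numpy as np

def labor_share(K, L, sigma, alpha=0.35, markup=1.0):
    """Labor share of income under CES production
    Y = (alpha*K**rho + (1-alpha)*L**rho)**(1/rho),  rho = (sigma-1)/sigma,
    with imperfect competition priced in via w = MPL / markup.
    Requires sigma != 1 (the Cobb-Douglas limit needs a separate formula).
    Illustrative parameterization, not an estimate from the article."""
    rho = (sigma - 1.0) / sigma
    Y = (alpha * K**rho + (1 - alpha) * L**rho) ** (1.0 / rho)
    mpl = (1 - alpha) * (Y / L) ** (1.0 / sigma)   # marginal product of labor
    return mpl * L / (markup * Y)

# Capital deepening lowers the labor share when sigma > 1
# and raises it when sigma < 1:
print(labor_share(K=2.0, L=1.0, sigma=1.5), labor_share(K=4.0, L=1.0, sigma=1.5))
print(labor_share(K=2.0, L=1.0, sigma=0.7), labor_share(K=4.0, L=1.0, sigma=0.7))
```

With sigma = 1 the share would be constant at (1 - alpha)/markup, which is why relaxing Cobb-Douglas is essential to the argument.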


This work proposes a parallel architecture for a motion estimation algorithm. It is well known that image processing requires a huge amount of computation, mainly at the low-level stage, where algorithms deal with great numbers of pixel data. One solution for estimating motion involves detecting the correspondences between two images. Due to its regular processing scheme, a parallel implementation of the correspondence problem is an adequate approach to reduce computation time. This work introduces a parallel, real-time implementation of such low-level tasks, carried out from the moment the current image is acquired by the camera until the pairs of point matchings are detected.
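
A minimal sketch of the correspondence step, assuming a sum-of-absolute-differences (SAD) block search, a common choice but not necessarily the exact matcher used in this work. Each block is matched independently, which is the regularity that makes the problem parallelize well.

```python
import numpy as np

def match_block(prev, curr, y, x, bs=8, search=4):
    """Find the displacement of one bs-by-bs block of `prev`, anchored at
    (y, x), inside `curr` by minimizing the sum of absolute differences
    over a +-search window. Blocks are independent, so the outer loop over
    all blocks parallelizes trivially. Illustrative sketch, not the
    paper's architecture."""
    ref = prev[y:y + bs, x:x + bs].astype(np.int32)
    best, best_dxy = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if yy < 0 or xx < 0 or yy + bs > curr.shape[0] or xx + bs > curr.shape[1]:
                continue  # candidate window falls outside the image
            sad = np.abs(curr[yy:yy + bs, xx:xx + bs].astype(np.int32) - ref).sum()
            if best is None or sad < best:
                best, best_dxy = sad, (dy, dx)
    return best_dxy
```

The per-block independence means a hardware or multi-core implementation can assign one block per processing element, which is the essence of the parallel scheme described above.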


We evaluate the performance of different optimization techniques developed in the context of optical flow computation with different variational models. In particular, based on truncated Newton methods (TN), which have been an effective approach for large-scale unconstrained optimization, we develop the use of efficient multilevel schemes for computing the optical flow. More precisely, we compare the performance of a standard unidirectional multilevel algorithm, called multiresolution optimization (MR/OPT), with a bidirectional multilevel algorithm, called full multigrid optimization (FMG/OPT). The FMG/OPT algorithm treats the coarse grid correction as an optimization search direction and eventually scales it using a line search. Experimental results on different image sequences using four models of optical flow computation show that the FMG/OPT algorithm outperforms both the TN and MR/OPT algorithms in terms of the computational work and the quality of the optical flow estimation.
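
The unidirectional coarse-to-fine idea behind MR/OPT-style schemes can be sketched on a toy problem: estimate a global integer shift between two images on a pyramid, solving the coarsest level first and then prolonging and locally refining at each finer level. The pyramid depth, block-average restriction, and +-1 refinement are illustrative assumptions; the actual algorithms in the paper minimize variational optical flow functionals.

```python
import numpy as np

def downsample(img):
    # restriction operator: 2x2 block averaging
    h, w = img.shape
    return img[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def coarse_to_fine_shift(a, b, levels=3):
    """Unidirectional multilevel estimate of a global integer shift from
    image a to image b: solve on the coarsest grid, then prolong the
    estimate and refine with a +-1 local search per level. A toy
    stand-in for coarse-to-fine variational optical flow."""
    pyramid = [(a, b)]
    for _ in range(levels - 1):
        a, b = downsample(a), downsample(b)
        pyramid.append((a, b))
    dy = dx = 0
    for pa, pb in reversed(pyramid):          # coarsest level first
        dy, dx = 2 * dy, 2 * dx               # prolongation to the finer grid
        candidates = [(dy + ey, dx + ex) for ey in (-1, 0, 1) for ex in (-1, 0, 1)]
        dy, dx = min(candidates,
                     key=lambda c: np.abs(np.roll(pb, (-c[0], -c[1]), (0, 1)) - pa).mean())
    return dy, dx
```

A bidirectional (FMG-style) scheme would additionally cycle back to coarse grids and treat coarse corrections as search directions, rather than sweeping fine-ward only once.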


This special issue aims to cover some problems related to non-linear and nonconventional speech processing. The origin of this volume is the ISCA Tutorial and Research Workshop on Non-Linear Speech Processing, NOLISP'09, held at the Universitat de Vic (Catalonia, Spain) on June 25-27, 2009. The series of NOLISP workshops, started in 2003, has become a biennial event whose aim is to discuss alternative techniques for speech processing that, in a sense, do not fit into mainstream approaches. A selection of papers based on the presentations delivered at NOLISP'09 has given rise to this issue of Cognitive Computation.


Background: Research in epistasis or gene-gene interaction detection for human complex traits has grown over the last few years. It has been marked by promising methodological developments, improved translation efforts of statistical epistasis to biological epistasis, and attempts to integrate different omics information sources into the epistasis screening to enhance power. The quest for gene-gene interactions poses severe multiple-testing problems. In this context, the maxT algorithm is one technique to control the false-positive rate. However, the memory needed by this algorithm rises linearly with the number of hypothesis tests. Gene-gene interaction studies will require an amount of memory proportional to the squared number of SNPs. A genome-wide epistasis search would therefore require terabytes of memory. Hence, cache problems are likely to occur, increasing the computation time. In this work we present a new version of maxT, requiring an amount of memory independent of the number of genetic effects to be investigated. This algorithm was implemented in C++ in our epistasis screening software MBMDR-3.0.3. We evaluate the new implementation in terms of memory efficiency and speed using simulated data. The software is illustrated on real-life data for Crohn's disease. Results: In the case of a binary (affected/unaffected) trait, the parallel workflow of MBMDR-3.0.3 analyzes all gene-gene interactions with a dataset of 100,000 SNPs typed on 1000 individuals within 4 days and 9 hours, using 999 permutations of the trait to assess statistical significance, on a cluster composed of 10 blades, each containing four Quad-Core AMD Opteron(tm) Processor 2352 units at 2.1 GHz. In the case of a continuous trait, a similar run takes 9 days. Our program found 14 SNP-SNP interactions with a multiple-testing corrected p-value of less than 0.05 on real-life Crohn's disease (CD) data.
Conclusions: Our software is the first implementation of the MB-MDR methodology able to solve large-scale SNP-SNP interaction problems within a few days, without using much memory, while adequately controlling the type I error rates. A new implementation to reach genome-wide epistasis screening is under construction. In the context of Crohn's disease, MBMDR-3.0.3 could identify epistasis involving regions that are well known in the field and could be explained from a biological point of view. This demonstrates the power of our software to find relevant phenotype-genotype higher-order associations.
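
The memory-saving idea, keeping only the maximum statistic per permutation instead of a full permutation-by-test matrix, can be sketched as follows. This is a generic maxT sketch under assumed inputs, not the MBMDR-3.0.3 code.

```python
import numpy as np

def maxt_pvalues(stat_fn, y, n_perm=999, seed=0):
    """maxT multiple-testing correction storing one maximum per
    permutation, so permutation memory is independent of the number of
    tests (a generic sketch in the spirit described above, not the
    MBMDR-3.0.3 implementation). stat_fn(labels) must return the vector
    of test statistics for those trait labels."""
    rng = np.random.default_rng(seed)
    observed = stat_fn(y)
    max_per_perm = np.empty(n_perm)
    for p in range(n_perm):
        # only the maximum over all tests is kept for this permutation
        max_per_perm[p] = stat_fn(rng.permutation(y)).max()
    sorted_max = np.sort(max_per_perm)
    # corrected p-value: fraction of permutation maxima >= observed statistic
    exceed = n_perm - np.searchsorted(sorted_max, observed, side="left")
    return (1 + exceed) / (n_perm + 1)
```

Because each permutation contributes a single scalar, memory stays at O(n_perm + n_tests) regardless of how many SNP-SNP pairs are screened, which is the property highlighted in the abstract.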


Helping behavior is any intentional behavior that benefits another living being or group (Hogg & Vaughan, 2010). People tend to underestimate the probability that others will comply with their direct requests for help (Flynn & Lake, 2008). This implies that when they need help, they will assess the probability of getting it (De Paulo, 1982, cited in Flynn & Lake, 2008) and will tend to estimate one that is actually lower than the real chance, so they may not even consider it worth asking. Existing explanations for this phenomenon attribute it to a mistaken cost computation by the help seeker, who emphasizes the instrumental cost of saying "yes" while ignoring that the potential helper also needs to take into account the social cost of saying "no". And the truth is that, especially in face-to-face interactions, the discomfort caused by refusing to help can be very high. In short, help seekers tend to fail to realize that it might be more costly to refuse a help request than to comply with it. A similar effect has been observed when estimating the trustworthiness of people. Fetchenhauer and Dunning (2010) showed that people also tend to underestimate it. This bias is reduced when, instead of asymmetric feedback (getting feedback only when deciding to trust the other person), symmetric feedback (always given) is provided. This explanation could also apply to help seeking, as people only receive feedback when they actually make their request, but not otherwise. Fazio, Shook, and Eiser (2004) studied something that could be reinforcing these outcomes: learning asymmetries. By means of a computer game called BeanFest, they showed that people learn better about negatively valenced objects (beans, in this case) than about positively valenced ones. This learning asymmetry stemmed from "information gain being contingent on approach behavior" (p. 293), which could be identified with what Fetchenhauer and Dunning call 'asymmetric feedback', and hence also with help requests. Fazio et al. also found a generalization asymmetry in favor of negative attitudes versus positive ones. They attributed it to a negativity bias that "weights resemblance to a known negative more heavily than resemblance to a positive" (p. 300). Applied to help-seeking scenarios, this would mean that when facing an unknown situation, people would tend to generalize and infer that a negative outcome is more likely than a positive one; so, along with what was said before, people will be more inclined to think that they will get a "no" when requesting help. Denrell and Le Mens (2011) present a different perspective when trying to explain judgment biases in general. They deviate from the classical inappropriate-information-processing account (depicted, among others, by Fiske & Taylor, 2007, and Tversky & Kahneman, 1974) and explain this in terms of 'adaptive sampling'. Adaptive sampling is a sampling mechanism in which the selection of sample items is conditioned by the values of the variable of interest previously observed (Thompson, 2011). Sampling adaptively allows individuals to safeguard themselves from experiences they went through once and that turned out to yield negative outcomes. However, it also prevents them from giving those experiences a second chance to produce an updated outcome that could turn out to be positive, more positive, or simply one that regresses to the mean, whatever direction that implies. That, as Denrell and Le Mens (2011) explained, makes sense: if you go to a restaurant and you do not like the food, you do not choose that restaurant again. This is what we think could be happening when asking for help: when we get a "no", we stop asking.
And here, we want to provide a complementary explanation for the underestimation of the probability that others comply with our direct help requests based on adaptive sampling. First, we will develop and explain a model that represents the theory. Later on, we will test it empirically by means of experiments, and will elaborate on the analysis of its results.
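
A minimal simulation of the adaptive sampling account illustrates the claimed bias: if a single "no" ends sampling, agents' average estimate of the true compliance rate falls below its real value, because negative experiences are never re-sampled. All parameters here are illustrative assumptions, not values from the study.

```python
import numpy as np

def adaptive_sampling_estimate(p_yes=0.7, n_agents=10_000, horizon=20, seed=0):
    """Toy model of adaptive sampling applied to help seeking: each agent
    keeps asking only while the answer is "yes", so a "no" is never
    revisited and the mean final estimate of p_yes is biased downward.
    Illustrative sketch, not the model developed in the paper."""
    rng = np.random.default_rng(seed)
    estimates = np.empty(n_agents)
    for i in range(n_agents):
        yes = no = 0
        for _ in range(horizon):
            if rng.random() < p_yes:
                yes += 1
            else:
                no += 1
                break               # a single "no" stops further asking
        estimates[i] = yes / (yes + no)
    return estimates.mean()
```

With a true compliance rate of 0.7, the average stopped-sampling estimate comes out well below 0.7, reproducing qualitatively the underestimation that the experiments are designed to probe.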


The computer simulation of reaction dynamics has nowadays reached a remarkable degree of accuracy. Triatomic elementary reactions are rigorously studied in great detail on a straightforward basis using a considerable variety of Quantum Dynamics computational tools available to the scientific community. In our contribution we compare the performance of two quantum scattering codes in the computation of reaction cross sections of a triatomic benchmark reaction, the gas-phase reaction Ne + H2+ → NeH+ + H. The computational codes are selected as representative of time-dependent (Real Wave Packet [ ]) and time-independent (ABC [ ]) methodologies. The main conclusion to be drawn from our study is that both strategies are, to a great extent, not competing but rather complementary. While time-dependent calculations offer advantages with respect to the energy range that can be covered in a single simulation, time-independent approaches offer much more detailed information from each single-energy calculation. Further details, such as the calculation of reactivity at very low collision energies or the computational effort required to account for the Coriolis couplings, are analyzed in this paper.


We consider the numerical treatment of the optical flow problem by evaluating the performance of the trust region method versus the line search method. To the best of our knowledge, the trust region method is studied here for the first time for variational optical flow computation. Four different optical flow models are used to test the performance of the proposed algorithm, combining linear and nonlinear data terms with quadratic and TV regularization. We show that trust region often performs better than line search, especially in the presence of non-linearity and non-convexity in the model.
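
For reference, a textbook trust-region method with a dogleg step on the local quadratic model looks as follows. This is a generic sketch under the assumption of a positive-definite model Hessian, not the specific scheme benchmarked in the paper.

```python
import numpy as np

def trust_region_minimize(f, grad, hess, x0, radius=1.0, max_radius=10.0,
                          eta=0.15, tol=1e-8, max_iter=200):
    """Basic trust-region method with a dogleg step on the model
    m(p) = f + g.p + 0.5 p.B.p subject to ||p|| <= radius.
    Assumes hess(x) is positive definite (dogleg requirement).
    Generic textbook sketch for illustration only."""
    x = np.asarray(x0, float)
    for _ in range(max_iter):
        g, B = grad(x), hess(x)
        if np.linalg.norm(g) < tol:
            break
        pB = np.linalg.solve(B, -g)              # full Newton step
        if np.linalg.norm(pB) <= radius:
            p = pB
        else:
            pU = -(g @ g) / (g @ B @ g) * g      # unconstrained steepest-descent minimizer
            if np.linalg.norm(pU) >= radius:
                p = radius * pU / np.linalg.norm(pU)
            else:
                # intersect the dogleg path pU -> pB with the boundary
                d = pB - pU
                a, b, c = d @ d, 2 * pU @ d, pU @ pU - radius**2
                t = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
                p = pU + t * d
        pred = -(g @ p + 0.5 * p @ B @ p)        # predicted reduction
        rho = (f(x) - f(x + p)) / pred           # actual vs predicted
        if rho < 0.25:
            radius *= 0.25                       # shrink: model untrustworthy
        elif rho > 0.75 and abs(np.linalg.norm(p) - radius) < 1e-12:
            radius = min(2 * radius, max_radius) # grow: boundary step worked well
        if rho > eta:
            x = x + p
    return x
```

The contrast with line search is visible in the control flow: the step length is fixed by the trusted region and the *model quality* (rho) drives the update, rather than a one-dimensional search along a fixed direction.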