924 results for The issue of autonomy


Relevance:

100.00%

Publisher:

Abstract:

We revisit the issue of considering stochasticity of the Grassmannian coordinates in $N = 1$ superspace, which was analyzed previously by Kobakhidze et al. In this stochastic supersymmetry (SUSY) framework, the soft SUSY breaking terms of the minimal supersymmetric Standard Model (MSSM), namely the bilinear Higgs mixing, the trilinear coupling and the gaugino mass parameters, are all proportional to a single mass parameter $\xi$, a measure of supersymmetry breaking arising out of stochasticity. While a nonvanishing trilinear coupling at the high scale is a natural outcome of the framework, and a favorable signature for obtaining a lighter Higgs boson mass $m_h$ of 125 GeV, the model produces tachyonic sleptons, or staus that turn out to be too light. The previous analyses took $\Lambda$, the scale at which the input parameters are given, to be larger than the gauge coupling unification scale $M_G$ in order to generate acceptable scalar masses radiatively at the electroweak scale. Even so, this was inadequate for obtaining $m_h$ at 125 GeV. We find that a Higgs boson at 125 GeV is readily achievable, provided we accommodate a nonvanishing scalar-mass soft SUSY breaking term, similar to what is done in minimal anomaly mediated SUSY breaking (AMSB) in contrast to a pure AMSB setup. Thus, the model can easily accommodate the Higgs data, the LHC limits on squark masses, the WMAP data for the dark matter relic density, flavor physics constraints, and the XENON100 data. In contrast to the previous analyses, we take $\Lambda = M_G$, thus avoiding any ambiguities of post-grand-unified-theory physics. The idea of stochastic superspace can easily be generalized to various scenarios beyond the MSSM. DOI: 10.1103/PhysRevD.87.035022
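
Schematically, the single-parameter pattern described in this abstract can be summarized as below, with $M_{1/2}$ the gaugino mass, $A_0$ the trilinear coupling, $B$ the bilinear Higgs mixing and $m_0$ the scalar mass; the overall normalisations are not quoted in the abstract and the expressions are illustrative only, not the paper's exact boundary conditions.

\[
M_{1/2} \;\propto\; \xi, \qquad A_0 \;\propto\; \xi, \qquad B \;\propto\; \xi, \qquad m_0 = 0 \ \text{at the input scale } \Lambda,
\]

where the last relation is the pure stochastic-superspace input that the work above relaxes to a nonvanishing $m_0$, as in minimal AMSB.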

Relevance:

100.00%

Publisher:

Abstract:

The last decade has witnessed two unusually large tsunamigenic earthquakes. The devastation from the 2004 Sumatra-Andaman and the 2011 Tohoku-Oki earthquakes (both of moment magnitude >= 9.0) and their ensuing tsunamis comes as a harsh reminder of the need to assess and mitigate coastal hazards due to earthquakes and tsunamis worldwide. Along any given subduction zone, megathrust tsunamigenic earthquakes recur over intervals considerably longer than their documented histories, and thus 2004-type events may appear to come totally 'out of the blue'. In order to understand and assess the risk from tsunamis, we need to know their long-term frequency and magnitude, going beyond documented history to recent geological records. The ability to do this depends on our knowledge of the processes that govern subduction zones and their responses to interseismic and coseismic deformation, and on our ability to identify tsunami deposits and relate them to earthquake sources. In this article, we review the current state of understanding of the recurrence of great thrust earthquakes along global subduction zones.

Relevance:

100.00%

Publisher:

Abstract:

Reliable estimates of species density are fundamental to planning conservation strategies for any species; it is equally crucial to identify the most appropriate technique for estimating animal density. Nocturnal, small-sized animal species are notoriously difficult to census accurately, and this issue critically affects their conservation status. We carried out a field study in southern India to estimate the density of the slender loris, a small-sized nocturnal primate, using line and strip transects. Actual counts of study individuals yielded a density estimate of 1.61 ha^-1; the density estimate from line transects was 1.08 ha^-1; and density estimates from fixed-width strip transects of different widths varied from 1.06 ha^-1 to 0.59 ha^-1. We conclude that line and strip transects may typically underestimate densities of cryptic, nocturnal primates.
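
As a rough illustration of the arithmetic behind a fixed-width strip-transect estimate (the count, transect length and strip width below are hypothetical placeholders, not values from this study), density is simply the number of detections divided by the surveyed area:

def strip_transect_density(n_detections, length_m, half_width_m):
    """Density (individuals per hectare) from a fixed-width strip transect.

    Surveyed area = transect length x full strip width; 1 ha = 10,000 m^2.
    """
    area_ha = (length_m * 2.0 * half_width_m) / 10_000.0
    return n_detections / area_ha

# Hypothetical numbers, for illustration only: 3 detections along a 1.5 km walk
# with a 10 m half-width survey 3 ha, giving 1.0 individuals per hectare.
print(strip_transect_density(n_detections=3, length_m=1_500.0, half_width_m=10.0))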

Relevance:

100.00%

Publisher:

Abstract:

This article presents an analysis of the retrospective predictions by seven coupled ocean-atmosphere models from major forecasting centres of Europe and the USA, aimed at assessing their ability to predict the interannual variation of the Indian summer monsoon rainfall (ISMR), particularly the extremes (i.e. droughts and excess rainfall seasons). On the whole, the skill in predicting the extremes is not bad, since most of the models are able to predict the sign of the ISMR anomaly for a majority of the extremes. There is a remarkable coherence between the models in the successes and failures of the predictions, with all the models generating loud false alarms for the normal monsoon season of 1997 and the excess monsoon season of 1983. It is well known that the El Niño-Southern Oscillation (ENSO) and the Equatorial Indian Ocean Oscillation (EQUINOO) play an important role in the interannual variation of ISMR, and particularly in the extremes. The prediction of the phases of these modes and of their link with the monsoon has also been assessed. It is found that the models are able to simulate the ENSO-monsoon link realistically, whereas the EQUINOO-ISMR link is simulated realistically by only one model, the ECMWF model. Furthermore, in most models this link is opposite to that observed, with the predicted ISMR being negatively (instead of positively) correlated with the rainfall over the western equatorial Indian Ocean and positively (instead of negatively) correlated with the rainfall over the eastern equatorial Indian Ocean. Analysis of the seasons for which the predictions of almost all the models have large errors suggests which facets of ENSO, EQUINOO and their links with the monsoon need to be simulated better in order to improve monsoon predictions by these models.
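
A minimal sketch of the kind of sign check described above, correlating an ISMR anomaly series with rainfall over the western and eastern equatorial Indian Ocean boxes; the series below are synthetic placeholders, not the hindcast data analysed in the article.

import numpy as np

def ismr_equatorial_links(ismr_anomaly, west_eq_io_rain, east_eq_io_rain):
    """Correlate ISMR anomalies with western/eastern equatorial Indian Ocean rainfall.

    Observationally the western link is positive and the eastern link negative;
    the abstract reports that most models reproduce these signs reversed.
    """
    r_west = np.corrcoef(ismr_anomaly, west_eq_io_rain)[0, 1]
    r_east = np.corrcoef(ismr_anomaly, east_eq_io_rain)[0, 1]
    return r_west, r_east

# Synthetic 30-season example with the observed signs built in:
rng = np.random.default_rng(0)
ismr = rng.standard_normal(30)
west = 0.6 * ismr + 0.8 * rng.standard_normal(30)
east = -0.5 * ismr + 0.9 * rng.standard_normal(30)
print(ismr_equatorial_links(ismr, west, east))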

Relevance:

100.00%

Publisher:

Abstract:

Managing the heat produced by computer processors is an important issue today, especially as the size of processors decreases rapidly while the number of transistors per processor increases rapidly. This poster describes a preliminary study of adding carbon nanotubes (CNTs) to a standard silicon paste covering a CPU. Measurements were made in two rounds of tests to compare the rate of cool-down with and without CNTs present. The silicon paste acts as an interface between the CPU and the heat sink, increasing the rate of heat transfer away from the CPU. CNTs were added to the silicon paste at 0.05% by weight; they were not aligned. A series of K-type thermocouples was used to measure the temperature as a function of time in the vicinity of the CPU following its shut-off, with an Omega data acquisition system attached to the thermocouples. The CPU temperature was not measured directly because attaching a thermocouple would have prevented its automatic shut-off. A thermocouple in the paste containing the CNTs actually reached a higher temperature than one in the standard paste, an effect easily explained. However, the rate of cooling with the CNTs was about 4.55% better.
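
One way such a rate-of-cooling comparison could be quantified from thermocouple traces is by fitting an exponential (Newton's-law) decay to each cool-down curve; the fitting approach and the numbers below are illustrative assumptions, not the poster's actual analysis or data.

import numpy as np

def cooling_rate_constant(t_s, temp_c, ambient_c):
    """Estimate the Newton's-law cooling rate constant k (1/s) from a cool-down trace.

    T(t) - T_amb ~ (T0 - T_amb) * exp(-k * t), so k is minus the slope of
    log(T - T_amb) against time.
    """
    excess = np.asarray(temp_c, dtype=float) - ambient_c
    slope, _ = np.polyfit(np.asarray(t_s, dtype=float), np.log(excess), 1)
    return -slope

# Synthetic traces, for illustration only: the CNT paste cools slightly faster.
t = np.linspace(0.0, 300.0, 31)
plain_paste = 25.0 + 40.0 * np.exp(-0.0100 * t)
cnt_paste = 25.0 + 42.0 * np.exp(-0.0105 * t)
k_plain = cooling_rate_constant(t, plain_paste, ambient_c=25.0)
k_cnt = cooling_rate_constant(t, cnt_paste, ambient_c=25.0)
print(f"cooling-rate improvement: {100.0 * (k_cnt - k_plain) / k_plain:.1f}%")  # ~5%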

Relevance:

100.00%

Publisher:

Abstract:

The issue of intermittency in numerical solutions of the 3D Navier-Stokes equations on a periodic box $[0, L]^3$ is addressed through four sets of numerical simulations that calculate a new set of variables defined by $D_m(t) = \left(\varpi_0^{-1}\,\Omega_m\right)^{\alpha_m}$ for $1 \le m \le \infty$, where $\alpha_m = 2m/(4m-3)$ and $\Omega_m(t) = \left(L^{-3}\int_{\mathcal{V}} |\omega|^{2m}\,dV\right)^{1/2m}$, with $\varpi_0 = \nu L^{-2}$. All four simulations unexpectedly show that the $D_m$ are ordered for $m = 1, \ldots, 9$ such that $D_{m+1} < D_m$. Moreover, the $D_m$ squeeze together such that $D_{m+1}/D_m \nearrow 1$ as $m$ increases. The values of $D_1$ lie far above the values of the rest of the $D_m$, giving rise to the suggestion that a depletion of nonlinearity is occurring, which could be the cause of Navier-Stokes regularity. The first simulation is of very anisotropic decaying turbulence; the second and third are of decaying isotropic turbulence from random initial conditions and of forced isotropic turbulence at fixed Grashof number, respectively; the fourth is of very-high-Reynolds-number forced, stationary, isotropic turbulence at resolutions of up to $4096^3$.
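
A minimal sketch of how the diagnostic defined above could be evaluated on a gridded vorticity snapshot; the random field, grid size and parameters are placeholders, not the simulations analysed in the paper.

import numpy as np

def D_m(omega, m, L, nu):
    """Compute D_m = (Omega_m / varpi_0)**alpha_m from a vorticity snapshot.

    omega: array of shape (3, N, N, N) holding the vorticity components on a
    uniform periodic grid of side L.
    """
    varpi0 = nu / L**2                       # frequency scale varpi_0 = nu * L**-2
    mag2 = np.sum(omega**2, axis=0)          # |omega|^2 on the grid
    volume_avg = np.mean(mag2**m)            # L^-3 * integral of |omega|^(2m) dV
    Omega_m = volume_avg ** (1.0 / (2 * m))  # Omega_m(t)
    alpha_m = 2 * m / (4 * m - 3)
    return (Omega_m / varpi0) ** alpha_m

# Synthetic random field, purely to exercise the formula:
rng = np.random.default_rng(1)
w = rng.standard_normal((3, 32, 32, 32))
print({m: D_m(w, m, L=2 * np.pi, nu=0.01) for m in (1, 2, 3)})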

Relevance:

100.00%

Publisher:

Abstract:

It is no exaggeration to state that the energy crisis is the most serious challenge that we face today. Among the strategies to gain access to reliable, renewable energy, the use of solar energy has clearly emerged as the most viable option. A promising direction in this context is artificial photosynthesis. In this article, we briefly describe the essential features of artificial photosynthesis in comparison with natural photosynthesis and point out the modest success that we have had in splitting water to produce oxygen and hydrogen, especially the latter.
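
For reference, the water-splitting chemistry mentioned above can be written as its two textbook half-reactions (a standard summary, not a scheme specific to this article):

\[
2\,\mathrm{H_2O} \;\rightarrow\; \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^- \quad \text{(oxidation)}, \qquad
4\,\mathrm{H^+} + 4\,e^- \;\rightarrow\; 2\,\mathrm{H_2} \quad \text{(reduction)},
\]

with the overall reaction $2\,\mathrm{H_2O} \rightarrow 2\,\mathrm{H_2} + \mathrm{O_2}$.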

Relevance:

100.00%

Publisher:

Abstract:

Nonequilibrium calculations in the presence of an electric field are usually performed in a fixed gauge and need to be transformed to reveal the gauge-invariant observables. In this work, we discuss the issue of gauge invariance in the context of time-resolved, angle-resolved pump/probe photoemission. If the probe is applied while the pump is still on, one must ensure that the calculations of the observed photocurrent are gauge invariant. We also discuss the requirement that the photoemission signal be positive and the relationship of this constraint to gauge invariance. We end by discussing some technical details related to the perturbative derivation of the photoemission spectra, which involve processes in which the pump pulse itself photoemits electrons due to nonequilibrium effects.
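
For context, the standard electromagnetic gauge transformation under which any physical observable, including the measured photocurrent, must remain unchanged is (in SI units, and as a generic textbook statement rather than the specific construction used in the paper):

\[
\mathbf{A} \;\rightarrow\; \mathbf{A} + \nabla\chi(\mathbf{r},t), \qquad
\phi \;\rightarrow\; \phi - \frac{\partial \chi(\mathbf{r},t)}{\partial t},
\]

which leaves the fields $\mathbf{E} = -\nabla\phi - \partial_t\mathbf{A}$ and $\mathbf{B} = \nabla\times\mathbf{A}$ unchanged for any scalar function $\chi$.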