949 results for subgrid-scale model


Relevance: 30.00%

Abstract:

A new inflationary scenario, whose exponential potential V(Φ) has a quadratic dependence on the field Φ in addition to the standard linear term, is confronted with the five-year observations of the Wilkinson Microwave Anisotropy Probe and the Sloan Digital Sky Survey data. The number of e-folds (N), the tensor-to-scalar perturbation ratio (r), the scalar spectral index of the primordial power spectrum (n_s) and its running (dn_s/d ln k) depend on the dimensionless parameter α multiplying the quadratic term in the potential. In the limit α → 0 all the results of the exponential potential are fully recovered. For values of α ≠ 0, we find that the model predictions are in good agreement with the current observations of the Cosmic Microwave Background (CMB) anisotropies and Large-Scale Structure (LSS) in the Universe. Copyright (C) EPLA, 2008.
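
The abstract never displays the potential itself; a form consistent with its description, an exponential whose argument carries the standard linear term plus an α-controlled quadratic one, would be (a hedged reconstruction, not the paper's exact expression):

```latex
V(\Phi) = V_0 \exp\!\left[-\lambda\,\Phi\,(1 + \alpha\,\Phi)\right],
\qquad
\lim_{\alpha \to 0} V(\Phi) = V_0\, e^{-\lambda\,\Phi},
```

so the pure exponential results are recovered continuously as α → 0, as stated.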

Relevance: 30.00%

Abstract:

A new accelerating cosmology driven only by baryons plus cold dark matter (CDM) is proposed in the framework of general relativity. In this scenario the present accelerating stage of the Universe is powered by the negative pressure describing the gravitationally induced particle production of cold dark matter particles. This kind of scenario has only one free parameter, and the differential equation governing the evolution of the scale factor is exactly the same as that of the ΛCDM model. For a spatially flat Universe, as predicted by inflation (Ω_dm + Ω_baryon = 1), it is found that the effectively observed matter density parameter is Ω_m,eff = 1 − α, where α is the constant parameter specifying the CDM particle creation rate. The supernovae test based on the Union data (2008) requires α ≈ 0.71, so that Ω_m,eff ≈ 0.29, as independently derived from weak gravitational lensing, the large-scale structure and other complementary observations.
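
Since the scale-factor equation is said to coincide with that of ΛCDM, the flat-universe expansion history presumably takes the familiar form with α in the role of Ω_Λ (a hedged reading of the abstract, not a quoted formula):

```latex
\frac{H^2(z)}{H_0^2} = (1-\alpha)(1+z)^3 + \alpha,
\qquad
\Omega_{m,\mathrm{eff}} = 1-\alpha \approx 1 - 0.71 = 0.29 .
```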

Relevance: 30.00%

Abstract:

Phylogenetic analyses of chloroplast DNA sequences, morphology, and combined data have provided consistent support for many of the major branches within the angiosperm clade Dipsacales. Here we use sequences from three mitochondrial loci to test the existing broad-scale phylogeny and to attempt to resolve several relationships that have remained uncertain. Parsimony, maximum likelihood, and Bayesian analyses of a combined mitochondrial data set recover trees broadly consistent with previous studies, although resolution and support are lower than in the largest chloroplast analyses. Combining chloroplast and mitochondrial data results in a generally well-resolved and very strongly supported topology, but the previously recognized problem areas remain. To investigate why these relationships have been difficult to resolve, we conducted a series of experiments using different data partitions and heterogeneous substitution models. Usually more complex modeling schemes are favored regardless of the partitions recognized, but model choice had little effect on topology or support values. In contrast, there are consistent but weakly supported differences between the topologies recovered from coding and non-coding matrices. These conflicts correspond directly to relationships that were poorly resolved in analyses of the full combined chloroplast-mitochondrial data set. We suggest that incongruent signal has contributed to our inability to confidently resolve these problem areas. (c) 2007 Elsevier Inc. All rights reserved.

Relevance: 30.00%

Abstract:

São Paulo is the most developed state in Brazil and retains few fragments of native ecosystems, generally surrounded by intensively farmed land. Despite this, some areas still shelter large native animals. We aimed to understand how medium and large carnivores use a mosaic landscape of forest/savanna and agroecosystems, and how the species respond to different landscape parameters (percentage of land cover and edge density) in a multi-scale perspective. The response variables were: species richness, carnivore frequency, and the frequency of the three most recorded species (Puma concolor, Chrysocyon brachyurus and Leopardus pardalis). We compared 11 competing models using Akaike's information criterion (AIC) and assessed model support using AIC weights. Concurrent models were combinations of land-cover type (native vegetation, "cerrado" formations, "cerrado" and eucalypt plantation), landscape feature (percentage of land cover and edge density) and spatial scale. Herein, spatial scale refers to the radius around a sampling point defining a circular landscape. The scales analyzed were 250 (fine), 1,000 (medium) and 2,000 m (coarse). The shape of the response curves (linear, exponential and power) was also assessed. Our results indicate that the highly mobile species, P. concolor and C. brachyurus, were best explained by the edge density of native vegetation at the coarse scale (2,000 m). The frequencies of P. concolor and C. brachyurus had a negative power-shaped response to the explanatory variables. This general trend was also observed for species richness and carnivore frequency. Species richness and P. concolor frequency were also well explained by a second concurrent model: edge density of cerrado at the fine (250 m) scale. A different response was recorded for L. pardalis, whose frequency was best explained by the amount of cerrado at the fine (250 m) scale; the response curve was linearly positive. The contrasting results (P. concolor and C. brachyurus vs. L. pardalis) may be due to the much higher mobility of the first two species in comparison with the third; moreover, L. pardalis requires higher-quality habitat than the other two species. This study highlights the importance of considering multiple spatial scales when evaluating species responses to different habitats. An important new finding was the prevalence of edge density over habitat extent in explaining overall carnivore distribution, key information for the planning and management of protected areas.
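
As a concrete illustration of the model-selection machinery this abstract describes, here is a minimal sketch of AIC and Akaike-weight computation. The model names, log-likelihoods and parameter counts are made-up placeholders, not values from the study:

```python
import numpy as np

# Hypothetical fitted candidates: (name, maximized log-likelihood, n. parameters)
models = [
    ("edge_native_2000m", -412.3, 3),
    ("edge_cerrado_250m", -414.1, 3),
    ("cover_cerrado_250m", -415.8, 3),
]

# AIC = 2k - 2 ln L for each candidate model
aic = np.array([2 * k - 2 * loglik for _, loglik, k in models])

# Akaike weights: w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2)
delta = aic - aic.min()
weights = np.exp(-delta / 2)
weights /= weights.sum()

for (name, _, _), a, w in zip(models, aic, weights):
    print(f"{name:22s} AIC={a:8.2f} weight={w:.3f}")
```

The weights sum to one and can be read directly as the relative support for each of the 11 competing models.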

Relevance: 30.00%

Abstract:

In this paper we propose a new lifetime distribution which can handle bathtub-shaped, unimodal, increasing and decreasing hazard rate functions. The model has three parameters and generalizes the exponential power distribution proposed by Smith and Bain (1975) with the inclusion of an additional shape parameter. The maximum likelihood estimation procedure is discussed. A small-scale simulation study examines the performance of the likelihood ratio statistics under small and moderate-sized samples. Three real datasets illustrate the methodology. (C) 2010 Elsevier B.V. All rights reserved.
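
The abstract does not give the generalized three-parameter density, so the sketch below fits the baseline two-parameter Smith-Bain exponential power model by maximum likelihood instead, with survival S(t) = exp(1 − exp((λt)^β)); the extra shape parameter of the paper is omitted:

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, t):
    """Negative log-likelihood of the Smith-Bain exponential power model.
    Density: f(t) = beta*lam*(lam*t)**(beta-1) * exp((lam*t)**beta) * S(t).
    Parameters are optimized on the log scale to keep them positive."""
    lam, beta = np.exp(params)
    z = (lam * t) ** beta
    loglik = np.sum(np.log(beta) + beta * np.log(lam)
                    + (beta - 1.0) * np.log(t) + z + 1.0 - np.exp(z))
    return -loglik

# Simulate from the model via inverse-CDF sampling:
# S(t) = u  =>  t = (log(1 - log(u)))**(1/beta) / lam
rng = np.random.default_rng(42)
u = rng.uniform(1e-9, 1.0, size=500)
lam_true, beta_true = 0.5, 0.8        # beta < 1 gives a bathtub-shaped hazard
t = (np.log(1.0 - np.log(u))) ** (1.0 / beta_true) / lam_true

fit = minimize(neg_loglik, x0=np.zeros(2), args=(t,), method="Nelder-Mead")
print("MLE (lambda, beta):", np.exp(fit.x))
```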

Relevance: 30.00%

Abstract:

The purpose of this paper is to develop a Bayesian analysis for nonlinear regression models under scale mixtures of skew-normal distributions. This novel class of models provides a useful generalization of symmetrical nonlinear regression models, since the error distributions cover both skewed and heavy-tailed distributions such as the skew-t, skew-slash and skew-contaminated normal distributions. The main advantage of this class of distributions is that it admits a convenient hierarchical representation that allows the implementation of Markov chain Monte Carlo (MCMC) methods to simulate samples from the joint posterior distribution. In order to examine the robustness of this flexible class against outlying and influential observations, we present Bayesian case-deletion influence diagnostics based on the Kullback-Leibler divergence. Further, some discussion of model selection criteria is given. The newly developed procedures are illustrated with two simulation studies and a real data set previously analyzed under normal and skew-normal nonlinear regression models. (C) 2010 Elsevier B.V. All rights reserved.
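
The hierarchical representation the abstract alludes to can be sketched as follows. The parameterization (shape λ, mixing variable U) is the standard stochastic representation of the skew-normal and its scale mixtures; the Gamma mixing that yields the skew-t is one concrete choice, not the paper's code:

```python
import numpy as np

rng = np.random.default_rng(0)

def rskew_normal(n, shape_lam):
    """Stochastic representation of SN(0, 1, lambda):
    Z = delta*|T0| + sqrt(1 - delta^2)*T1, delta = lambda/sqrt(1 + lambda^2),
    with T0, T1 independent standard normals."""
    delta = shape_lam / np.sqrt(1.0 + shape_lam**2)
    t0, t1 = rng.standard_normal(n), rng.standard_normal(n)
    return delta * np.abs(t0) + np.sqrt(1.0 - delta**2) * t1

def rskew_t(n, shape_lam, nu):
    """Scale mixture: dividing an SN draw by sqrt(U), U ~ Gamma(nu/2, nu/2),
    yields the skew-t; other mixing laws give the skew-slash, etc."""
    u = rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=n)
    return rskew_normal(n, shape_lam) / np.sqrt(u)

sample = rskew_t(10_000, shape_lam=3.0, nu=4.0)
print("sample skewness sign:", np.sign(np.mean((sample - sample.mean())**3)))
```

It is exactly this conditional-normal structure, given |T0| and U, that makes Gibbs-style MCMC updates tractable.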

Relevance: 30.00%

Abstract:

In interval-censored survival data, the event of interest is not observed exactly but is only known to occur within some time interval. Such data appear very frequently. In this paper we are concerned only with parametric forms, and so a location-scale regression model based on the exponentiated Weibull distribution is proposed for modeling interval-censored data. We show that the proposed log-exponentiated-Weibull regression model for interval-censored data represents a parametric family that includes other regression models broadly used in lifetime data analysis. Assuming interval-censored data, we employ a frequentist analysis, a jackknife estimator, a parametric bootstrap and a Bayesian analysis for the parameters of the proposed model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes and present some ways to assess global influence. Furthermore, various simulations are performed for different parameter settings, sample sizes and censoring percentages; in addition, the empirical distribution of some modified residuals is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a modified deviance residual in log-exponentiated-Weibull regression models for interval-censored data. (C) 2009 Elsevier B.V. All rights reserved.
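
The defining feature of interval censoring is that each observation contributes F(R_i) − F(L_i) to the likelihood. A minimal sketch with the exponentiated Weibull CDF follows; the parameter names are the usual Mudholkar-Srivastava ones, and the regression structure on log(T) used in the paper is not reproduced here:

```python
import numpy as np

def ew_cdf(t, alpha, sigma, theta):
    """Exponentiated Weibull CDF: F(t) = [1 - exp(-(t/sigma)**alpha)]**theta."""
    return (1.0 - np.exp(-(t / sigma) ** alpha)) ** theta

def interval_censored_loglik(left, right, alpha, sigma, theta):
    """Each case contributes log[F(R_i) - F(L_i)]; a right-censored case is
    encoded as R_i = inf, for which F(R_i) = 1."""
    fr = np.where(np.isinf(right), 1.0, ew_cdf(right, alpha, sigma, theta))
    fl = ew_cdf(left, alpha, sigma, theta)
    return np.sum(np.log(fr - fl))

# Toy intervals: the event lies in (L, R]; the last case is right-censored
L = np.array([0.5, 1.0, 2.0, 3.0])
R = np.array([1.5, 2.0, 4.0, np.inf])
print(interval_censored_loglik(L, R, alpha=1.5, sigma=2.0, theta=0.8))
```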

Relevance: 30.00%

Abstract:

Two fundamental processes usually arise in the production planning of many industries. The first consists of deciding how many final products of each type to produce in each period of a planning horizon, the well-known lot-sizing problem. The other consists of cutting raw materials in stock to produce the smaller parts used in the assembly of final products, the well-studied cutting-stock problem. In this paper the decision variables of these two problems are coupled in order to obtain a globally optimal solution. The setups typically present in lot-sizing problems are relaxed, together with the integrality of the cutting-pattern frequencies in the cutting problem. A large-scale linear optimization problem therefore arises, which is solved exactly by a column generation technique. It is worth noting that this combined problem still captures the trade-off between storage costs (for final products and parts) and trim losses (in the cutting process). We present several sets of computational tests, analyzed over three different scenarios. The results show that, by combining the problems and using an exact method, it is possible to obtain significant gains compared with the usual industrial practice, which solves them in sequence. (C) 2010 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
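
To make the column generation technique concrete, here is a minimal Gilmore-Gomory sketch for the pure cutting-stock relaxation; the coupled lot-sizing layer of the paper is omitted, the data are made up, and dual extraction assumes scipy >= 1.7 (HiGHS):

```python
import numpy as np
from scipy.optimize import linprog

W = 10                          # stock roll width (made-up data)
lengths = np.array([3, 4, 5])   # part lengths
demand = np.array([30, 20, 15])

# Start with trivial patterns: one part type per roll
cols = [np.eye(1, 3, i).ravel() * (W // l) for i, l in enumerate(lengths)]

for _ in range(20):
    A = np.column_stack(cols)
    # Master LP: min sum(x) s.t. A x >= demand  (written as -A x <= -demand)
    res = linprog(np.ones(A.shape[1]), A_ub=-A, b_ub=-demand, method="highs")
    duals = -res.ineqlin.marginals           # prices pi_i >= 0

    # Pricing: unbounded knapsack, max pi^T a s.t. lengths^T a <= W
    best = np.zeros(W + 1)
    choice = np.full(W + 1, -1, dtype=int)
    for c in range(1, W + 1):
        for i, l in enumerate(lengths):
            if l <= c and best[c - l] + duals[i] > best[c]:
                best[c], choice[c] = best[c - l] + duals[i], i
    if best[W] <= 1.0 + 1e-9:                # no pattern with reduced cost < 0
        break
    new_col, c = np.zeros(3), W              # recover the knapsack solution
    while c > 0 and choice[c] >= 0:
        new_col[choice[c]] += 1
        c -= lengths[choice[c]]
    cols.append(new_col)

print(f"LP bound: {res.fun:.2f} rolls using {len(cols)} patterns")
```

The loop alternates between the restricted master LP and the knapsack pricing subproblem, adding one improving cutting pattern per iteration, which is the core mechanism the paper applies to its combined formulation.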

Relevance: 30.00%

Abstract:

Security administrators face the challenge of designing, deploying and maintaining a variety of configuration files related to security systems, especially in large-scale networks. These files have heterogeneous syntaxes and follow differing semantic concepts. Nevertheless, they are interdependent, since security services have to cooperate and their configurations must be mutually consistent so that global security policies are completely and correctly enforced. To tackle this problem, our approach supports a comfortable definition of an abstract high-level security policy and provides automated derivation of the desired configuration files. It extends policy-based management and policy hierarchies, combining model-based management (MBM) with system modularization. MBM employs an object-oriented model of the managed system to obtain the details needed for automated policy refinement. The modularization into abstract subsystems (ASs) segments the system, and the model, into units that closely encapsulate related system components and provide focused abstract views. As a result, scalability is achieved and even comprehensive IT systems can be modelled in a unified manner. The associated tool MoBaSeC (Model-Based-Service-Configuration) supports interactive graphical modelling, automated model analysis and policy refinement with derivation of configuration files. We describe the MBM and AS approaches, outline the tool's functions and exemplify their application and the results obtained. Copyright (C) 2010 John Wiley & Sons, Ltd.
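
The refinement idea, one abstract policy fanning out into several concrete configuration fragments via an object model, can be illustrated with a toy sketch. Every class and rule below is a hypothetical stand-in; this is not MoBaSeC's data model or output format:

```python
from dataclasses import dataclass

@dataclass
class Service:          # element of the (toy) object-oriented system model
    name: str
    host: str
    port: int

@dataclass
class Policy:           # abstract, system-independent policy statement
    allow_from: str     # e.g. a network zone
    service: str

def refine(policy: Policy, services: list[Service]) -> dict[str, list[str]]:
    """Derive per-host firewall lines from one high-level policy statement.
    Real MBM refinement walks a full object model; this only sketches how one
    abstract rule is expanded into consistent per-device config fragments."""
    configs: dict[str, list[str]] = {}
    for svc in services:
        if svc.name == policy.service:
            line = f"allow tcp from {policy.allow_from} to {svc.host} port {svc.port}"
            configs.setdefault(svc.host, []).append(line)
    return configs

services = [Service("web", "dmz-web-1", 443), Service("web", "dmz-web-2", 443)]
print(refine(Policy(allow_from="LAN", service="web"), services))
```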

Relevance: 30.00%

Abstract:

The Sznajd model (SM) has been employed with success in recent years to describe opinion propagation in a community. In particular, it has been claimed that its transient is able to reproduce some scale properties observed in data from proportional elections in different countries, provided the community structure (the network) is scale-free. In this work, we investigate the properties of the transient of a particular version of the SM, introduced by Bernardes and co-authors in 2002. We study the behavior of the model in networks of different topologies through the time evolution of an order parameter known as the interface density, and conclude that regular lattices of high dimensionality also lead to a power-law distribution of the number of candidates with v votes. We also show that the particular absorbing state reached in the stationary regime (that is, the winning candidate) is related to a specific feature of the model that may not be realistic in all situations.
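
For readers unfamiliar with the dynamics, here is a minimal sketch of the original one-dimensional two-state Sznajd rule together with the interface-density order parameter; the Bernardes et al. (2002) network version studied in the paper differs in its details:

```python
import numpy as np

rng = np.random.default_rng(1)

def interface_density(s):
    """Fraction of neighbouring pairs holding different opinions."""
    return np.mean(s != np.roll(s, 1))

def sznajd_step(s):
    """One update of the original 1D Sznajd rule: if the randomly chosen pair
    (i, i+1) agrees, it imposes its opinion on both outer neighbours."""
    n = len(s)
    i = rng.integers(n)
    j = (i + 1) % n
    if s[i] == s[j]:
        s[(i - 1) % n] = s[i]
        s[(j + 1) % n] = s[i]

s = rng.choice([-1, 1], size=1000)   # ring of 1000 agents, random opinions
for t in range(100_001):
    sznajd_step(s)
    if t % 25_000 == 0:
        print(f"t={t:7d} interface density={interface_density(s):.3f}")
```

The interface density decays toward zero as the system approaches one of its absorbing (consensus) states, which is the transient behavior the paper tracks across topologies.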

Relevance: 30.00%

Abstract:

We present a large-scale systematics of charge densities, excitation energies and deformation parameters for hundreds of heavy nuclei. The systematics is based on a generalized rotation-vibration model for the quadrupole and octupole modes and takes into account second-order contributions of the deformations as well as the effects of finite diffuseness values of the nuclear densities. We compare our results with the predictions of classical surface vibrations in the hydrodynamical approximation. (C) 2010 Elsevier B.V. All rights reserved.
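
A typical parameterization consistent with the quantities named here (deformation parameters β₂, β₃ and a finite diffuseness a) is the deformed Fermi charge density, given as a hedged illustration rather than the paper's exact ansatz:

```latex
\rho(r,\theta) = \frac{\rho_0}{1 + \exp\!\left[\,(r - R(\theta))/a\,\right]},
\qquad
R(\theta) = R_0\left[\,1 + \beta_2\,Y_{20}(\theta) + \beta_3\,Y_{30}(\theta)\,\right],
```

where a is the surface diffuseness and β₂, β₃ are the quadrupole and octupole deformation parameters.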

Relevance: 30.00%

Abstract:

We perform an analysis of the electroweak precision observables in the Lee-Wick Standard Model. The most stringent restrictions come from the S and T parameters, which receive important tree-level and one-loop contributions. In general the model predicts a large positive S and a negative T. To reproduce the electroweak data, if all the Lee-Wick masses are of the same order, the Lee-Wick scale must be of order 5 TeV. We show that it is possible to find some regions of the parameter space with a fermionic state as light as 2.4-3.5 TeV, at the price of raising all the other masses above 5-8 TeV. To obtain a light Higgs with such heavy resonances, a fine-tuning of at least a few per cent is needed. We also propose a simple extension of the model including a fourth generation of Standard Model fermions with their Lee-Wick partners. We show that in this case it is possible to pass the electroweak constraints with Lee-Wick fermionic masses of order 0.4-1.5 TeV and Lee-Wick gauge masses of order 3 TeV.

Relevance: 30.00%

Abstract:

We construct exact vortex solutions in 3+1 dimensions to a theory which is an extension, due to Gies, of the Skyrme-Faddeev model, and which is believed to describe some aspects of the low-energy limit of pure SU(2) Yang-Mills theory. Despite the efforts of recent decades, these are the first exact analytical solutions constructed for this type of theory. The exact vortices appear in a very particular sector of the theory, characterized by special values of the coupling constants and by a constraint that leads to an infinite number of conserved charges. The theory is scale invariant in that sector, and the solutions satisfy Bogomolny-type equations. The energy of the static vortex is proportional to its topological charge, and waves can travel along the vortex with the speed of light, adding to the energy a term proportional to a U(1) Noether charge they create. We believe such vortices may play a role in the strong-coupling regime of pure SU(2) Yang-Mills theory.
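
The abstract does not display the Bogomolny construction; its generic structure, shown only schematically here with placeholder operators 𝒟 and ℬ, is that the static energy is completed into a square plus a topological term:

```latex
E = \int d^3x\, \left|\, \mathcal{D}\phi \mp \mathcal{B}(\phi) \,\right|^2
    \;+\; c\,Q_{\mathrm{top}}
\;\;\geq\;\; c\,\lvert Q_{\mathrm{top}} \rvert,
```

with equality, E = c|Q_top|, precisely when the first-order (Bogomolny-type) equations 𝒟φ = ±ℬ(φ) hold, matching the statement that the vortex energy is proportional to its topological charge.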

Relevance: 30.00%

Abstract:

Large-scale simulations of parts of the brain using detailed neuronal models, aimed at improving our understanding of brain function, are becoming a reality with the use of supercomputers and large clusters. However, the high acquisition and maintenance costs of these computers, including physical space, air conditioning and electrical power, limit the number of such simulations that scientists can perform. Modern commodity graphics cards based on the CUDA platform contain graphics processing units (GPUs) composed of hundreds of processors that can simultaneously execute thousands of threads, and thus constitute a low-cost solution for many high-performance computing applications. In this work, we present a CUDA algorithm that enables the execution, on multiple GPUs, of simulations of large-scale networks composed of biologically realistic Hodgkin-Huxley neurons. The algorithm represents each neuron as a CUDA thread, which solves the set of coupled differential equations that model the neuron. Communication among neurons located on different GPUs is coordinated by the CPU. We obtained speedups of 40 for the simulation of 200k neurons receiving random external input, and speedups of 9 for a network with 200k neurons and 20M neuronal connections, on a single computer with two graphics boards holding two GPUs each, compared with a modern quad-core CPU. Copyright (C) 2010 John Wiley & Sons, Ltd.
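
The per-thread work described here is the integration of the Hodgkin-Huxley equations. A minimal NumPy sketch of one forward-Euler step for a whole population of uncoupled neurons follows, using the standard textbook HH constants; it mirrors the thread-per-neuron layout but is not the paper's CUDA kernel:

```python
import numpy as np

# Standard Hodgkin-Huxley constants (squid axon; mV, ms, uF/cm^2 units)
C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3
ENa, EK, EL = 50.0, -77.0, -54.4

def hh_step(V, m, h, n, I_ext, dt=0.01):
    """One forward-Euler step of the HH equations for a vector of neurons at
    once; in the CUDA version each vector lane maps to one thread."""
    am = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
    bm = 4.0 * np.exp(-(V + 65.0) / 18.0)
    ah = 0.07 * np.exp(-(V + 65.0) / 20.0)
    bh = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
    an = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
    bn = 0.125 * np.exp(-(V + 65.0) / 80.0)

    I_ion = gNa * m**3 * h * (V - ENa) + gK * n**4 * (V - EK) + gL * (V - EL)
    V = V + dt * (I_ext - I_ion) / C
    m = m + dt * (am * (1.0 - m) - bm * m)
    h = h + dt * (ah * (1.0 - h) - bh * h)
    n = n + dt * (an * (1.0 - n) - bn * n)
    return V, m, h, n

N = 200_000                                  # one "thread" per neuron
V = np.full(N, -65.0)
m, h, n = np.full(N, 0.05), np.full(N, 0.6), np.full(N, 0.32)
for _ in range(1000):                        # 10 ms of simulated time
    V, m, h, n = hh_step(V, m, h, n, I_ext=10.0)
print("mean membrane potential:", V.mean())
```

Because every neuron executes the same arithmetic on its own state, the update is embarrassingly parallel, which is why the one-neuron-per-thread mapping suits GPUs; only the synaptic exchange between GPUs needs CPU coordination.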

Relevance: 30.00%

Abstract:

A Bayesian inference approach using Markov chain Monte Carlo (MCMC) is developed for the logistic positive exponent (LPE) model proposed by Samejima and for a new skewed logistic Item Response Theory (IRT) model, named the Reflection LPE (RLPE) model. Both models lead to asymmetric item characteristic curves (ICCs), which can be appropriate because a symmetric ICC treats correct and incorrect answers symmetrically, resulting in a logical contradiction when ordering examinees on the ability scale. A data set from a mathematics test applied in Peruvian public schools is analyzed, and comparisons with other parametric IRT models are also conducted. Several model comparison criteria are discussed and implemented. The main conclusion is that the LPE and RLPE IRT models are easy to implement and seem to provide the best fit to the data set considered.
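
The abstract does not write the curves out; the usual parameterization of Samejima's LPE raises a two-parameter logistic ICC to a power ξ, and the reflected version skews the curve the other way. A hedged sketch:

```python
import numpy as np

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def icc_lpe(theta, a, b, xi):
    """Logistic positive exponent ICC: the 2PL curve raised to a power xi;
    xi != 1 makes the curve asymmetric (usual LPE parameterization)."""
    return logistic(a * (theta - b)) ** xi

def icc_rlpe(theta, a, b, xi):
    """Reflection LPE: moves the skew to the other side of the curve."""
    return 1.0 - (1.0 - logistic(a * (theta - b))) ** xi

theta = np.linspace(-4, 4, 9)
print("LPE :", np.round(icc_lpe(theta, a=1.2, b=0.0, xi=0.5), 3))
print("RLPE:", np.round(icc_rlpe(theta, a=1.2, b=0.0, xi=0.5), 3))
```

With ξ = 1 both reduce to the symmetric 2PL curve, which is the case the abstract argues against for ordering examinees on the ability scale.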