987 results for Resource Constraints


Relevance:

30.00%

Publisher:

Abstract:

In this paper, a heterogeneous network composed of femtocells deployed within a macrocell network is considered, and a quality-of-service (QoS)-oriented fairness metric which captures important characteristics of tiered network architectures is proposed. Using homogeneous Poisson processes, the sum capacities in such networks are expressed in closed form for co-channel, dedicated channel, and hybrid resource allocation methods. Then a resource splitting strategy that simultaneously considers capacity maximization, fairness constraints, and QoS constraints is proposed. Detailed computer simulations utilizing 3GPP simulation assumptions show that a hybrid allocation strategy with a well-designed resource split ratio enjoys the best cell-edge user performance, with minimal degradation in the sum throughput of macrocell users when compared with that of co-channel operation.
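
Purely as an illustration of the trade-off this abstract describes, the following toy sketch compares tier sum rates as the resource split ratio varies. Everything here is an assumption for demonstration (the helper `shannon_sum_rate`, the SINR distributions, and the 10 dB cross-tier penalty); it is not the paper's closed-form Poisson-process model.

```python
import numpy as np

# Toy model: under a hybrid split with ratio rho, a fraction rho of the
# band is dedicated to femtocells and the rest is shared co-channel.
rng = np.random.default_rng(0)

def shannon_sum_rate(sinr_db, bandwidth_frac):
    """Sum of Shannon rates (b/s/Hz, scaled by the allocated band fraction)."""
    sinr = 10 ** (np.asarray(sinr_db) / 10)
    return bandwidth_frac * np.sum(np.log2(1 + sinr))

# Assumed SINR draws for macrocell and femtocell users (dB).
macro_sinr = rng.normal(5, 4, size=50)    # includes cell-edge users
femto_sinr = rng.normal(15, 3, size=20)

for rho in (0.0, 0.25, 0.5):
    macro = shannon_sum_rate(macro_sinr, 1 - rho)       # shared band only
    femto = (shannon_sum_rate(femto_sinr, rho)          # dedicated band
             + shannon_sum_rate(femto_sinr - 10, 1 - rho))  # shared band,
                                                        # 10 dB cross-tier hit
    print(f"rho={rho:.2f}  macro sum rate={macro:7.1f}  femto sum rate={femto:7.1f}")
```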

Relevance:

30.00%

Publisher:

Abstract:

Wireless sensor networks (WSNs) differ from conventional distributed systems in many respects. The resource limitations of sensor nodes, the ad-hoc communication and topology of the network, and an unpredictable deployment environment are difficult non-functional constraints that must be carefully taken into account when developing software for a WSN. More research is therefore needed on designing, implementing, and maintaining software for WSNs. This thesis contributes to that research by presenting an approach to WSN application development that improves the reusability, flexibility, and maintainability of the software. First, we present a programming model and software architecture for describing WSN applications independently of the underlying operating system and hardware. The proposed architecture is described and realized using the Model-Driven Architecture (MDA) standard in order to achieve satisfactory levels of encapsulation and abstraction when programming sensor nodes. In addition, we study different non-functional constraints of WSN applications and propose two approaches to optimizing an application so that it satisfies these constraints. A prototype framework was built to demonstrate the solutions developed in the thesis. The framework implements the programming model and the multi-layered software architecture as components; a graphical interface, code-generation components, and supporting tools are also included to help developers design, implement, optimize, and test WSN software. Finally, we evaluate and critically assess the proposed concepts. Two case studies support the evaluation. The first, a framework evaluation, assesses the ease with which novice and intermediate users can develop correct and power-efficient WSN applications, the portability achieved by developing applications at a high level of abstraction, and the estimated overhead of the framework in terms of the footprint and executable code size of the application. In the second case study, we discuss the design, implementation, and optimization of a real-world application named TempSense, in which a sensor network monitors the temperature within an area.
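
A minimal sketch of the platform-independence idea described above, with hypothetical names (`SensorPlatform`, `TempSenseApp`, `MockPlatform`) that are not the thesis framework's actual API: the application logic depends only on an abstract platform interface, and MDA-style tooling (or a hand-written adapter) binds it to a concrete OS/hardware target.

```python
from abc import ABC, abstractmethod

class SensorPlatform(ABC):
    """Platform-specific services, implemented once per OS/hardware target."""
    @abstractmethod
    def read_temperature(self) -> float: ...
    @abstractmethod
    def send(self, payload: bytes) -> None: ...
    @abstractmethod
    def sleep_ms(self, ms: int) -> None: ...

class TempSenseApp:
    """Platform-independent application model: sample, filter, report."""
    def __init__(self, platform: SensorPlatform, threshold: float):
        self.platform = platform
        self.threshold = threshold

    def step(self) -> None:
        t = self.platform.read_temperature()
        if t > self.threshold:            # report only salient readings
            self.platform.send(f"{t:.1f}".encode())
        self.platform.sleep_ms(1000)      # duty-cycle to save power

class MockPlatform(SensorPlatform):
    """In-memory stand-in used to test the model off-device."""
    def read_temperature(self) -> float: return 23.7
    def send(self, payload: bytes) -> None: print("sent:", payload)
    def sleep_ms(self, ms: int) -> None: pass

TempSenseApp(MockPlatform(), threshold=20.0).step()  # -> sent: b'23.7'
```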

Relevance:

20.00%

Publisher:

Abstract:

Resource specialisation, although a fundamental component of ecological theory, is employed in disparate ways. Most definitions derive from simple counts of resource species. We build on recent advances in ecophylogenetics and null model analysis to propose a concept of specialisation that comprises affinities among resources as well as their co-occurrence with consumers. In the distance-based specialisation index (DSI), specialisation is measured as relatedness (phylogenetic or otherwise) of resources, scaled by the null expectation of random use of locally available resources. Thus, specialists use significantly clustered sets of resources, whereas generalists use over-dispersed resources. Intermediate species are classed as indiscriminate consumers. The effectiveness of this approach was assessed with differentially restricted null models, applied to a data set of 168 herbivorous insect species and their hosts. Incorporation of plant relatedness and relative abundance greatly improved specialisation measures compared to taxon counts or simpler null models, which overestimate the fraction of specialists, a problem compounded by insufficient sampling effort. This framework disambiguates the concept of specialisation with an explicit measure applicable to any mode of affinity among resource classes, and is also linked to ecological and evolutionary processes. This will enable a more rigorous deployment of ecological specialisation in empirical and theoretical studies.
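
A minimal sketch in the spirit of the DSI described above (the exact formula is an assumption, not taken from the paper): the observed mean pairwise distance among the resources a consumer uses is standardised against a null model that draws resources in proportion to their local availability, so negative scores indicate phylogenetically clustered resource use (specialists) and positive scores over-dispersed use (generalists).

```python
import numpy as np

rng = np.random.default_rng(42)

def dsi(dist, used, availability, n_null=999):
    """Standardised effect size of mean pairwise resource distance.

    dist         : (R, R) pairwise (e.g. phylogenetic) distance matrix
    used         : indices of the resources the consumer uses
    availability : relative abundance of each of the R resources
    """
    def mean_pairwise(idx):
        sub = dist[np.ix_(idx, idx)]
        return sub[np.triu_indices(len(idx), k=1)].mean()

    obs = mean_pairwise(used)
    p = np.asarray(availability, dtype=float)
    p = p / p.sum()
    null = np.array([
        mean_pairwise(rng.choice(len(p), size=len(used), replace=False, p=p))
        for _ in range(n_null)
    ])
    return (obs - null.mean()) / null.std()  # negative => clustered => specialist

# Toy example: 6 resources with a random symmetric distance matrix,
# a consumer that uses 3 of them, and equal availabilities.
d = rng.random((6, 6)); d = (d + d.T) / 2; np.fill_diagonal(d, 0)
print(dsi(d, used=np.array([0, 1, 2]), availability=np.ones(6)))
```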

Relevance:

20.00%

Publisher:

Abstract:

The overall prevalence of infertility was estimated to be 3.5-16.7% in developing countries and 6.9-9.3% in developed countries; furthermore, according to reports from some regions of sub-Saharan Africa, the prevalence is 30-40%. The consequences of infertility, and how it affects the lives of women in resource-poor settings, particularly in developing countries, have become an important issue in reproductive health. In some societies, the inability to fulfill the desire to have children makes life difficult for the infertile couple. In many regions, infertility is considered a tragedy that affects not only the infertile couple or woman but the entire family. This position paper reviews the needs of low-income infertile couples, mainly those living in developing countries, regarding access to infertility care, including assisted reproductive technologies (ART) and initiatives to provide ART at low or affordable cost. Information was gathered from the MEDLINE, CENTRAL, POPLINE, EMBASE, LILACS, and ICTRP databases using the keywords: infertility, low income, assisted reproductive technologies, affordable cost, low cost. There are few initiatives geared toward implementing ART procedures at low, or at least affordable, cost in low-income populations. Nevertheless, recent studies point to possibilities for new low-cost initiatives that could help millions of couples achieve the desire of having a biological child. Healthcare professionals and policymakers need to take these new initiatives into account in order to implement ART in resource-constrained settings.

Relevance:

20.00%

Publisher:

Abstract:

The topic of environmental sustainability is generating increased concern among business executives, governments, consumers, and management scholars. As these stakeholders struggle with the challenges and opportunities presented by an array of environmental issues, HRM scholars and practitioners alike have been relatively slow to engage in the ongoing discussions and debates. Through this special issue on Green HRM, we seek to stimulate the field of HRM to expand its role in the pursuit of environmentally sustainable business. In this introduction to the special issue, we first provide an overview of the articles that appear in the special issue. Next, we present a detailed discussion of research questions that arise from a consideration of several functional HRM practices, including performance management; training, development, and learning; compensation and rewards; and organizational culture. We conclude by describing opportunities for research at the intersection of strategic HRM and environmental management. If pursued with vigor, research addressing this extensive agenda could begin to establish a healthy field of Green HRM scholarship.

Relevance:

20.00%

Publisher:

Abstract:

We present a re-analysis of the Geneva-Copenhagen survey, which benefits from the infrared flux method to improve the accuracy of the derived stellar effective temperatures and uses the latter to build a consistent and improved metallicity scale. Metallicities are calibrated on high-resolution spectroscopy and checked against four open clusters and a moving group, showing excellent consistency. The new temperature and metallicity scales provide a better match to theoretical isochrones, which are used for a Bayesian analysis of stellar ages. With respect to previous analyses, our stars are on average 100 K hotter and 0.1 dex more metal-rich, which shifts the peak of the metallicity distribution function to around the solar value. From Strömgren photometry we are able to derive, for the first time, a proxy for [$\alpha$/Fe] abundances, which enables us to perform a tentative dissection of the chemical thin and thick disc. We find evidence that the latter is composed of an old, mildly but systematically alpha-enhanced population that extends to super-solar metallicities, in agreement with spectroscopic studies. Our revision offers the largest existing kinematically unbiased sample of the solar neighbourhood containing full information on kinematics, metallicities, and ages, and thus provides better constraints on the physical processes relevant to the build-up of the Milky Way disc, enabling a better understanding of the Sun in a Galactic context.

Relevance:

20.00%

Publisher:

Abstract:

We discuss the dynamics of the Universe within the framework of the massive graviton cold dark matter scenario (MGCDM), in which gravitons are geometrically treated as massive particles. In this modified gravity theory, the main effect of the gravitons is to alter the density evolution of the cold dark matter component in such a way that the Universe evolves to an accelerating expanding regime, as presently observed. Tight constraints on the main cosmological parameters of the MGCDM model are derived by performing a joint likelihood analysis involving recent type Ia supernovae data, the cosmic microwave background shift parameter, and the baryonic acoustic oscillations traced by the Sloan Digital Sky Survey red luminous galaxies. The linear evolution of small density fluctuations is also analyzed in detail. It is found that the growth factor of the MGCDM model differs only slightly ($\sim$1-4%) from the one provided by the conventional flat $\Lambda$CDM cosmology. The growth rates of clustering predicted by the MGCDM and $\Lambda$CDM models are confronted with observations, and the corresponding best-fit values of the growth index ($\gamma$) are determined. Using the expectations of realistic future X-ray and Sunyaev-Zel'dovich cluster surveys, we derive the dark matter halo mass function and the corresponding redshift distribution of cluster-size halos for the MGCDM model. Finally, we show that the Hubble flow differences between the MGCDM and $\Lambda$CDM models produce a halo redshift distribution that departs significantly from those predicted by other dark energy models. These results suggest that the MGCDM model can be observationally distinguished from $\Lambda$CDM and from a large number of dark energy models recently proposed in the literature.
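
For context, the growth index $\gamma$ referred to above is conventionally defined through the standard parametrization of the linear growth rate (a textbook relation, not a result specific to this paper):

$$ f(z) \equiv \frac{d\ln\delta}{d\ln a} \simeq \left[\Omega_m(z)\right]^{\gamma}, $$

where $\delta$ is the linear matter density contrast and $a$ the scale factor; for flat $\Lambda$CDM, $\gamma \approx 6/11 \approx 0.55$.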

Relevance:

20.00%

Publisher:

Abstract:

We discuss the properties of homogeneous and isotropic flat cosmologies in which the present accelerating stage is powered only by the gravitationally induced creation of cold dark matter (CCDM) particles ($\Omega_m = 1$). For some matter creation rates proposed in the literature, we show that the main cosmological functions, such as the scale factor of the universe, the Hubble expansion rate, the growth factor, and the cluster formation rate, are analytically defined. The best CCDM scenario has only one free parameter, and our joint analysis involving baryonic acoustic oscillations + cosmic microwave background (CMB) + SNe Ia data yields $\tilde{\Omega}_m = 0.28 \pm 0.01$ ($1\sigma$), where $\tilde{\Omega}_m$ is the observed matter density parameter. In particular, this implies that the model has no dark energy, but the part of the matter that is effectively clustering is in good agreement with the latest determinations from large-scale structure. The growth of perturbations and the formation of galaxy clusters in such scenarios are also investigated. Despite the fact that both scenarios may share the same Hubble expansion, we find that matter creation cosmologies predict stronger small-scale dynamics, which implies a faster growth rate of perturbations with respect to the usual $\Lambda$CDM cosmology. Such results point to the possibility of a crucial observational test confronting CCDM with $\Lambda$CDM scenarios through a more detailed analysis involving the CMB, weak lensing, and large-scale structure.
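
A schematic sketch of the kind of one-parameter joint fit described above. Everything here is illustrative: the expansion rate `E(z)` below is a stand-in $\Lambda$CDM-like form rather than the paper's CCDM model, and the "data" are mock distance moduli generated inside the script, not real SNe Ia, BAO, or CMB measurements.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

C = 299792.458  # speed of light, km/s
H0 = 70.0       # km/s/Mpc, fixed for illustration

def E(z, om):                      # dimensionless expansion rate H(z)/H0
    return np.sqrt(om * (1 + z)**3 + (1 - om))

def dl(z, om):                     # luminosity distance in a flat universe (Mpc)
    dc = quad(lambda x: 1 / E(x, om), 0, z)[0] * C / H0
    return (1 + z) * dc

def chi2(om, z, mu, sigma):        # SNe-like chi-square for one parameter
    model = 5 * np.log10([dl(zi, om) for zi in z]) + 25
    return np.sum(((mu - model) / sigma) ** 2)

# Mock 'data': distance moduli generated at om = 0.28 (placeholder only).
z = np.linspace(0.05, 1.0, 20)
mu = np.array([5 * np.log10(dl(zi, 0.28)) + 25 for zi in z])
best = minimize_scalar(chi2, bounds=(0.1, 0.5), method="bounded",
                       args=(z, mu, np.full_like(z, 0.15)))
print(f"best-fit density parameter: {best.x:.3f}")  # recovers ~0.28
```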

Relevance:

20.00%

Publisher:

Abstract:

Aims. We calculate the theoretical event rate of gamma-ray bursts (GRBs) from the collapse of massive first-generation (Population III; Pop III) stars. Pop III GRBs could be super-energetic, with isotropic energies up to $E_{\rm iso} \gtrsim 10^{55-57}$ erg, providing a unique probe of the high-redshift Universe. Methods. We consider both the so-called Pop III.1 stars (primordial) and Pop III.2 stars (primordial but affected by radiation from other stars). We employ a semi-analytical approach that accounts for inhomogeneous hydrogen reionization and the chemical evolution of the intergalactic medium. Results. We show that Pop III.2 GRBs occur more than 100 times more frequently than Pop III.1 GRBs, and thus should be suitable targets for future GRB missions. Interestingly, our optimistic model predicts an event rate that is already constrained by current radio transient searches. We expect $\sim 10$-$10^4$ radio afterglows above $\sim 0.3$ mJy on the sky, with variability on $\sim 1$ year timescales and mostly without observed GRBs (orphans), detectable by ALMA, EVLA, LOFAR, and SKA. We expect to observe at most $N < 20$ GRBs per year integrated over $z > 6$ for Pop III.2 and $N < 0.08$ per year integrated over $z > 10$ for Pop III.1 with EXIST, and $N < 0.2$ per year integrated over $z > 6$ for Pop III.2 GRBs with Swift.

Relevance:

20.00%

Publisher:

Abstract:

The kinematic approach to cosmological tests provides direct evidence for the present accelerating stage of the Universe that does not depend on the validity of general relativity or on the matter-energy content of the Universe. In this context, we consider a linear two-parameter expansion for the deceleration parameter, $q(z) = q_0 + q_1 z$, where $q_0$ and $q_1$ are arbitrary constants to be constrained by the Union supernovae data. Assuming a flat Universe, we find that the best fit to the pair of free parameters is $(q_0, q_1) = (-0.73, 1.5)$, whereas the transition redshift is $z_t = 0.49^{+0.14}_{-0.07}$ ($1\sigma$) $^{+0.54}_{-0.12}$ ($2\sigma$). This kinematic result is in agreement with some independent analyses and more easily accommodates many dynamical flat models (like $\Lambda$CDM).
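
Since the transition redshift follows directly from the linear parametrization, the step is worth making explicit: acceleration begins where the deceleration parameter changes sign, $q(z_t) = 0$, so

$$ q_0 + q_1 z_t = 0 \quad\Longrightarrow\quad z_t = -\frac{q_0}{q_1} = \frac{0.73}{1.5} \approx 0.49, $$

which reproduces the best-fit central value quoted above.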

Relevance:

20.00%

Publisher:

Abstract:

Background: Bayesian mixing models have allowed for the inclusion of uncertainty and prior information in the analysis of trophic interactions using stable isotopes. Formulating prior distributions is relatively straightforward when incorporating dietary data. However, the use of data that are related, but not directly proportional, to diet (such as prey availability data) is often problematic, because such information is not necessarily predictive of diet, and the information required to build a reliable prior distribution for all prey species is often unavailable. Omitting prey availability data impacts the estimation of a predator's diet and introduces the strong assumption of consumer ultrageneralism (where all prey are consumed in equal proportions), particularly when multiple prey have similar isotope values. Methodology: We develop a procedure to incorporate prey availability data into Bayesian mixing models conditional on the similarity of isotope values between two prey. If a pair of prey have similar isotope values (resulting in highly uncertain mixing-model results), our model increases the weight of the availability data in estimating the contribution of those prey to a predator's diet. We test the utility of this method in an intertidal community against independently measured feeding rates. Conclusions: Our results indicate that our weighting procedure increases the accuracy with which consumer diets can be inferred in situations where multiple prey have similar isotope values. This suggests that the exchange of formalism for predictive power is merited, particularly when the relationship between prey availability and a predator's diet cannot be assumed for all species in a system.
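
A minimal sketch of the weighting idea described above, with assumed functional forms (the exponential similarity kernel and the Dirichlet prior are illustrative choices, not the paper's exact procedure): the closer two prey are in isotope space, the more the availability data are allowed to inform the prior on diet proportions.

```python
import numpy as np

def availability_weight(prey_isotopes, tau=1.0):
    """Weight in [0, 1]: approaches 1 when prey isotope values are similar."""
    a, b = np.asarray(prey_isotopes, dtype=float)
    d = np.linalg.norm(a - b)          # distance in (d13C, d15N) space
    return float(np.exp(-d / tau))

def diet_prior(availability, weight, concentration=10.0):
    """Dirichlet prior on diet proportions.

    weight = 0 -> uninformative prior (all prey equal, the
    'ultrageneralist' assumption); weight = 1 -> prior proportional
    to prey availability.
    """
    p = np.asarray(availability, dtype=float)
    p = p / p.sum()
    uniform = np.full_like(p, 1.0 / len(p))
    return 1.0 + concentration * (weight * p + (1 - weight) * uniform)

# Two prey with nearly identical (d13C, d15N) signatures: availability
# data dominate the prior on their diet contributions.
w = availability_weight([(-18.0, 9.0), (-18.2, 9.1)])
print(diet_prior(availability=[0.7, 0.3], weight=w))
```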

Relevance:

20.00%

Publisher:

Abstract:

This paper reports results from a search for $\nu_\mu \to \nu_e$ transitions by the MINOS experiment, based on a $7 \times 10^{20}$ protons-on-target exposure. Our observation of 54 candidate $\nu_e$ events in the far detector, with a background of $49.1 \pm 7.0\,(\mathrm{stat}) \pm 2.7\,(\mathrm{syst})$ events predicted by the measurements in the near detector, requires $2\sin^2(2\theta_{13})\sin^2\theta_{23} < 0.12\ (0.20)$ at the 90% C.L. for the normal (inverted) mass hierarchy at $\delta_{CP} = 0$. The experiment sets the tightest limits to date on the value of $\theta_{13}$ for nearly all values of $\delta_{CP}$, for the normal neutrino mass hierarchy and maximal $\sin^2(2\theta_{23})$.

Relevance:

20.00%

Publisher:

Abstract:

For Au+Au collisions at $\sqrt{s_{NN}} = 200$ GeV, we measure neutral pion production with good statistics for transverse momentum $p_T$ up to 20 GeV/$c$. A fivefold suppression is found, which is essentially constant for $5 < p_T < 20$ GeV/$c$. Experimental uncertainties are small enough to constrain any model-dependent parametrization of the transport coefficient of the medium, e.g., $\hat{q}$ in the parton quenching model. The spectral shape is similar for all collision classes, and the suppression does not saturate in Au+Au collisions.

Relevance:

20.00%

Publisher:

Abstract:

The PHENIX experiment has measured the suppression of semi-inclusive single high-transverse-momentum $\pi^0$'s in Au+Au collisions at $\sqrt{s_{NN}} = 200$ GeV. The present understanding of this suppression is in terms of energy loss of the parent (fragmenting) parton in a dense color-charge medium. We have performed a quantitative comparison between various parton energy-loss models and our experimental data. Statistical and point-to-point uncorrelated, as well as correlated, systematic uncertainties are taken into account in the comparison. We detail this methodology and the resulting constraints on model parameters such as the initial color-charge density $dN_g/dy$, the medium transport coefficient $\hat{q}$, and the initial energy-loss parameter $\epsilon_0$. We find that high-transverse-momentum $\pi^0$ suppression in Au+Au collisions has sufficient precision to constrain these model-dependent parameters at the $\pm$20-25% (one standard deviation) level. These constraints include only the experimental uncertainties, and further studies are needed to compute the corresponding theoretical uncertainties.
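
A schematic sketch of comparing a model curve to data with both uncorrelated and correlated systematic uncertainties, in the spirit of the methodology described above. The nuisance-parameter chi-square below is a standard generic construction, not PHENIX's exact prescription, and all numbers are illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def chi2_with_correlated_syst(y_data, y_model, sig_uncorr, sig_corr):
    """Profile a single nuisance parameter eps that shifts all points
    coherently by eps * sig_corr (a fully correlated systematic), with
    a unit-Gaussian penalty term eps**2."""
    def chi2(eps):
        shifted = y_data + eps * sig_corr   # coherent systematic shift
        return np.sum(((shifted - y_model) / sig_uncorr) ** 2) + eps ** 2
    res = minimize_scalar(chi2, bounds=(-3, 3), method="bounded")
    return res.fun, res.x

# Illustrative suppression-factor-like data vs. a flat model prediction:
y_data = np.array([0.23, 0.21, 0.22, 0.24, 0.20])
sig_u  = np.full(5, 0.02)                   # point-to-point uncorrelated
sig_c  = 0.1 * y_data                       # 10% fully correlated
chi2min, eps_hat = chi2_with_correlated_syst(y_data, np.full(5, 0.2),
                                             sig_u, sig_c)
print(f"chi2_min = {chi2min:.2f} at eps = {eps_hat:+.2f}")
```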

Relevance:

20.00%

Publisher:

Abstract:

We present rigorous upper and lower bounds for the momentum-space ghost propagator $G(p)$ of Yang-Mills theories in terms of the smallest nonzero eigenvalue (and of the corresponding eigenvector) of the Faddeev-Popov matrix. We apply our analysis to data from simulations of SU(2) lattice gauge theory in Landau gauge, using the largest lattice sizes to date. Our results suggest that, in three and in four space-time dimensions, the Landau-gauge ghost propagator is not enhanced as compared to its tree-level behavior. This is also seen in plots and fits of the ghost dressing function. In the two-dimensional case, on the other hand, we find that $G(p)$ diverges as $p^{-2-2\kappa}$ with $\kappa \approx 0.15$, in agreement with A. Maas, Phys. Rev. D 75, 116004 (2007). We note that our discussion is general, although we make an application only to pure gauge theory in Landau gauge. Our simulations were performed on the IBM supercomputer at the University of São Paulo.